Four Automotive Functional Safety Mistakes You Must Avoid | Heisener Electronics

Published: 2022-05-30, TinyCircuits

Automotive stakeholders such as OEMs and Tier 1 suppliers must treat functional safety (FuSa) as an organization-wide practice. That is easier said than done: implementing ISO 26262-compliant FuSa presents its own set of challenges, and failure to address them can lead to project-management errors that burden the project with delays and rising costs. Poor management can stem from an overall lack of safety awareness in the organization or from poor coordination among cross-functional teams.

In the automotive ecosystem, one stakeholder's negligence can also affect the others. If a Tier 1 supplier does not conduct hazard analysis thoroughly, its architectural designs can be riddled with unidentified hazards and their associated risks. Likewise, staffing safety-critical projects with engineers who have not been trained in the ISO 26262 standard carries its own perils.

1 Lack of safety awareness in the organization

Functional safety is not limited to the safety teams working on safety-critical automotive projects. From developers and test engineers to project managers, every team member must understand the ISO 26262 standard and its guidelines. Let's look at some FuSa mistakes made when an organization lacks overall safety awareness.

——Missing safety culture: A safety culture essentially means that every stakeholder in automotive software or hardware development takes functional safety seriously: no hazard is ignored, every stage of the safety life cycle receives attention, and teams coordinate and work together. Simply appointing a functional safety manager or consultant without focusing on building a safety culture is the most common mistake organizations make.

——Focusing on documentation, not safety: Documentation is an important part of ISO 26262 compliance; these documents serve as evidence during the OEM's certification and safety assessment. However, focusing solely on documentation rather than on the actual safety requirements, goals, and mechanisms is counterproductive.

——Assumption-based ASIL determination: Determining the Automotive Safety Integrity Level (ASIL) of an automotive module without performing a Hazard Analysis and Risk Assessment (HARA) is a common practice that must be avoided. Assuming an ASIL based on industry convention is not recommended, because it can cause hazards to be overlooked. For example, infotainment systems are often considered ASIL B, so many infotainment development companies skip HARA and simply treat ASIL B as a given for their solution. But what if the infotainment system also carries camera data that is used to automate certain actions in the vehicle? That is a serious safety risk, and it goes unnoticed because of the assumption.
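To make the HARA-to-ASIL step concrete: ISO 26262-3 determines the ASIL from three parameters rated during HARA: severity (S1-S3), exposure (E1-E4), and controllability (C1-C3). A minimal sketch in Python follows; the function name is ours, and encoding the standard's lookup table as a sum of the three class indices is a well-known shortcut that reproduces the table, not how the standard itself presents it:

```python
# ASIL determination following the risk graph of ISO 26262-3.
# Inputs come from HARA: Severity S1-S3, Exposure E1-E4, Controllability C1-C3.
def determine_asil(severity: int, exposure: int, controllability: int) -> str:
    """Map HARA parameters to an ASIL (or QM). Raises on out-of-range input."""
    if not (1 <= severity <= 3 and 1 <= exposure <= 4 and 1 <= controllability <= 3):
        raise ValueError("S must be 1-3, E must be 1-4, C must be 1-3")
    # The standard's table is equivalent to summing the three class indices:
    # a total of 10 (S3/E4/C3) yields ASIL D, 9 yields C, 8 yields B,
    # 7 yields A, and anything lower is QM (quality management only).
    total = severity + exposure + controllability
    return {10: "ASIL D", 9: "ASIL C", 8: "ASIL B", 7: "ASIL A"}.get(total, "QM")

# A hazard rated low on all axes stays QM, but the same module reused to
# automate a vehicle action may rate much higher once HARA is actually done.
print(determine_asil(1, 3, 2))  # -> QM
print(determine_asil(3, 4, 3))  # worst case on all three axes -> ASIL D
```

The point of the sketch is that the ASIL is an output of HARA, not an input you can assume up front.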

Figure 1 HARA as a process is the culmination of the ISO 26262 prescriptive framework and the team's understanding of functional safety and automotive functionality.

2 Safety mismanagement caused by undermining functional safety

Some automotive suppliers and technology providers understand the ISO 26262 standard and its nuances, yet, to cut costs and shorten time-to-market, they compromise the functional safety of certain safety-critical components. Even seemingly accidental neglect of safety requirements or hazards can endanger the lives of vehicle occupants.

——Underestimating the overall project timeline/effort: Once safety criticality enters the picture, along with the ISO 26262 standard, the effort increases for obvious reasons, and the timeline extends with it. Typically, ASIL A means a 10-15% increase in workload, rising to as much as 100% for ASIL D compliant projects. Underestimating this effort without considering the safety requirements and goals is another ISO 26262 compliance mistake to avoid. Projects start to suffer when companies try to compress the implementation phase to meet predetermined, arbitrary deadlines.

——Considering safety only at the end of the product life cycle: As mentioned earlier, the architectural design of an ISO 26262 solution is created from both software requirements and safety requirements. When you develop automotive solutions that must comply with the ISO 26262 standard, its guidelines must be followed from the beginning of the product life cycle. Incorporating them at the end of the life cycle, or in a second iteration, has proven to be a huge mistake for the following reasons:

1 Design rework is extensive, because the original design did not account for safety.

2 Legacy code cannot be reused as-is, because it is not ISO 26262 compliant. Checking preconditions, using wrappers between modules that send/receive signals, and introducing new modules mean that a lot of new code may need to be written.

3 Sometimes the entire design needs to change, which can force a change of microcontroller platform. That means designing the product from the ground up.
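The wrapper-and-precondition pattern from point 2 can be sketched as follows. This is a hypothetical illustration: every name (`legacy_speed_filter`, `safe_speed_filter`) and the 0-400 km/h plausibility range are invented for the example, not taken from any real project or from the standard:

```python
# Hypothetical sketch: legacy code that was not written for ISO 26262 is
# isolated behind a wrapper that checks preconditions on incoming signals
# and postconditions on results before anything downstream consumes them.

class SignalRangeError(Exception):
    """Raised when a signal violates its specified plausibility range."""

def legacy_speed_filter(raw_kph: float) -> float:
    """Stand-in for reused legacy code with no built-in safety checks."""
    return raw_kph * 0.98  # e.g. a simple calibration factor

def safe_speed_filter(raw_kph: float) -> float:
    """Wrapper adding the checks the safety analysis requires."""
    # Precondition: plausibility range for a passenger-vehicle speed signal.
    if not (0.0 <= raw_kph <= 400.0):
        raise SignalRangeError(f"implausible speed signal: {raw_kph}")
    result = legacy_speed_filter(raw_kph)
    # Postcondition: the filter must not drive the value out of range.
    if not (0.0 <= result <= 400.0):
        raise SignalRangeError(f"implausible filter output: {result}")
    return result
```

Even a thin wrapper like this illustrates why retrofitting safety is costly: each signal crossing a module boundary needs its own checks, fault reactions, and tests.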

——Insufficient investment in tools and engineering skill sets: Many organizations believe that having a functional safety manager is sufficient to ensure ISO 26262 compliance, and a certain complacency about safety often follows, whether in qualifying tools or in upskilling engineers. It is always advisable to train everyone associated with an ISO 26262 compliant project: from developers and testers to project managers, every stakeholder must have a good grasp of the practices in the ISO 26262 standard.

Figure 2 Functional safety, no longer an afterthought, takes on a life of its own in vehicle design.

3 Mismanagement of FuSa due to poor coordination among stakeholders

ISO 26262 compliant projects involve resources from different teams and with different skills. There are developers, test engineers, hardware experts, project managers, functional safety managers, etc.

——Lack of coordination among teams: Different teams within an organization need to coordinate to accomplish the various safety activities. For example, to perform a hardware Failure Modes, Effects, and Diagnostic Analysis (FMEDA), the software team must have a clear understanding of the safety mechanisms. Sometimes teams do not appreciate the importance of this collaboration.
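To make the FMEDA dependency concrete, here is a minimal sketch of the single-point fault metric (SPFM) defined in ISO 26262-5, which is one of the numbers an FMEDA produces. The failure rates below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Toy FMEDA roll-up: the single-point fault metric (SPFM) summarizes how well
# the safety mechanisms cover safety-related hardware faults -- which is why
# hardware analysts need the software team's view of those mechanisms.
# Failure rates are in FIT (failures per 1e9 operating hours); values are
# hypothetical.

def spfm(total_fit: float, uncovered_fit: float) -> float:
    """SPFM = 1 - (single-point/residual fault rate) / (total safety-related rate)."""
    return 1.0 - uncovered_fit / total_fit

# Hypothetical element: 200 FIT safety-related in total, of which 4 FIT are
# not covered by any safety mechanism.
metric = spfm(total_fit=200.0, uncovered_fit=4.0)
print(f"SPFM = {metric:.1%}")  # prints "SPFM = 98.0%"
```

ISO 26262-5 sets SPFM targets of at least 90% for ASIL B, 97% for ASIL C, and 99% for ASIL D, so in this hypothetical example the element would meet the ASIL C target but not the ASIL D one; without accurate coverage claims from the software side, the numerator of this metric is guesswork.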

——Poor coordination between OEMs and Tier 1 suppliers: In some cases, OEMs fail to provide adequate support and preparation for safety compliance: hazards are skipped, safety analyses are not performed properly, and all sorts of mismanagement ensue. In addition, an OEM's failure to assess the tooling capabilities of its Tier 1 supplier can prove detrimental to the project.

4 A mix of technical and managerial errors

Some mistakes arise when budget shortfalls or project-management overruns start to interfere with sound ISO 26262 engineering practice. Let's take a look at them.

——Setting the bar for safety-critical systems too low: When you start ignoring hazards and loosening acceptance criteria, your project is at risk. For example, there may be only one safety issue in a module that warrants ASIL C, yet you choose to ignore it and stick to ASIL B. This is a serious organizational mistake, and rising cost is its main cause: testing all the extreme test cases, performing additional safety analyses, and investing in tool licenses all increase project cost, and there is even a risk of damaging boards, LEDs, and motors during testing. Nonetheless, these faults must be checked for the sake of safety.

——Underestimating related standards such as SOTIF and ASPICE: In addition to functional safety, other standards such as SOTIF (ISO 21448), ASPICE, and cybersecurity (ISO/SAE 21434) are followed according to the requirements of the project. Since all of these standards deal with development and testing guidelines, there is considerable overlap between them. The most common mistake in this area is not considering the interrelationships between these standards when developing the safety plan; many activities can run in parallel or even be combined to save time. For example, the software qualification testing recommended by ASPICE is similar to the software integration testing recommended by ISO 26262: in principle, both examine the high-level architecture of the software. Templates for such similar processes can be combined to save a lot of time and effort. Another common mistake when applying several standards together is ignoring the effect of one standard on another.

——Overengineering: Not every automotive module is safety-critical, and you only know a module's criticality once you perform HARA. Organizations sometimes do not want to perform all the safety activities and instead assume a higher ASIL for a module just to be on the safe side. This leads to implementing safety mechanisms that are not even needed. Such practices must be avoided to optimize cost and time-to-market.

ISO 26262 is a broad standard, and organizations cannot reach maturity in their functional safety practices in a short period of time. By avoiding the mistakes listed above, however, they can speed up the process.