
How Sensitive is Your System?

Mike Jensen

Posted Sep 10, 2010

Running a Sensitivity Analysis is one of my favorite mechatronic system simulations. I like SystemVision’s ability to analyze my design and tell me which parameters have the most effect on a performance metric of my choosing. Knowing which parameters to tweak in order to influence a particular measurement makes designing systems a bit easier.

Before joining the SystemVision team…in another life with another simulator…I used a standard sensitivity analysis with a traditional flow: change a parameter, run a simulation, measure the results, then compare the measurement with nominal results to determine the sensitivity. Straightforward and simple. But SystemVision lets me dig a little deeper into my system’s sensitivities with two additional sensitivity analyses, Tolerance-Based and Statistical, added to its standard Relative Sensitivity analysis.

SystemVision’s Relative Sensitivity analysis is the standard simulation outlined above. The results tell me the percent change I can expect in a measured metric for a 1% change in parameter value. So if the waveform analyzer returns a relative sensitivity measurement of 0.45 for a performance metric, the default interpretation is “a 0.45% change in performance metric is expected for a 1% change in parameter value”. More generally, a parameter change of N% will lead to a change of sens_result*N% in the measured value, where “sens_result” is the raw number returned by the sensitivity measurement.
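To make the interpretation concrete, here is a minimal sketch of a finite-difference relative sensitivity calculation. The `metric` function (an RC filter's cutoff frequency) and the 1% perturbation step are my own illustrative assumptions, standing in for whatever SystemVision actually simulates and measures.

```python
import math

def metric(r, c):
    # Hypothetical performance metric: an RC filter's -3 dB cutoff frequency.
    return 1.0 / (2 * math.pi * r * c)

def relative_sensitivity(f, params, name, delta=0.01):
    """Percent change in f per 1% change in params[name] (finite difference)."""
    nominal = f(**params)
    perturbed = dict(params, **{name: params[name] * (1 + delta)})
    changed = f(**perturbed)
    return ((changed - nominal) / nominal) / delta

params = {"r": 1e3, "c": 1e-6}
s = relative_sensitivity(metric, params, "r")
# For cutoff = 1/(2*pi*R*C), a 1% increase in R lowers the cutoff by about 1%,
# so s comes out close to -1 (about -0.99 because of the finite step).
```

A result of -0.99 reads exactly as described above: a 1% increase in R produces roughly a 0.99% decrease in the cutoff frequency, and an N% change scales to about 0.99*N%.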

The Tolerance-Based Sensitivity analysis adds additional detail. If tolerance information is added to a parameter, and the parameter is included in the sensitivity analysis, the tolerance information is considered in the sensitivity measurements. By combining the Relative Sensitivity and Tolerance-Based Sensitivity analysis results, I end up with some interesting insight into my system. Turns out that a parameter with a high relative sensitivity but a very tight tolerance may have very little effect on my system’s performance metric. Even though the relative sensitivity is high, there isn’t much room for the parameter to vary within its tight tolerance. On the other hand, if a parameter has a low relative sensitivity but its tolerance is very loose, the parameter is more likely to have a measurable effect on my system. Even though the relative sensitivity is low, the parameter has lots of room to vary within its loose tolerance.
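The "high sensitivity, tight tolerance" versus "low sensitivity, loose tolerance" trade-off can be sketched with a back-of-the-envelope ranking: a parameter's worst-case contribution to the metric is roughly its relative sensitivity times its tolerance in percent. The part names and numbers below are purely illustrative, not SystemVision output.

```python
# Hypothetical parts: (relative sensitivity, tolerance in %)
parts = {
    "R1": (0.90, 0.1),   # high sensitivity, tight 0.1% tolerance
    "C1": (0.10, 20.0),  # low sensitivity, loose 20% tolerance
}

# Worst-case metric shift, in percent, attributable to each part.
impact = {name: abs(s) * tol for name, (s, tol) in parts.items()}
# R1: 0.90 * 0.1  = 0.09% worst-case shift
# C1: 0.10 * 20.0 = 2.0%  worst-case shift

ranked = sorted(impact, key=impact.get, reverse=True)
# The loose-tolerance C1 dominates despite its low relative sensitivity.
```

This matches the insight above: R1's high sensitivity is neutralized by its tight tolerance, while C1's loose tolerance gives its small sensitivity plenty of room to matter.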

Finally, the Statistical Sensitivity analysis might be the most interesting of all. It tells me which parameters contribute most to a performance measurement. In other words, it shows me what percentage of the performance metric variability is due to a specific parameter change. When all of the percentages are added up, the total should be approximately 100%; the more runs I include in the statistical analysis, the closer the total gets to 100%. Results are interpreted a bit differently from either the Relative or Tolerance-Based sensitivity analyses. To reduce the variability of my design, I need to tighten tolerances on the parameters that have the most effect on my chosen performance metric. To reduce the cost of my design, I can loosen tolerances on the parameters that have little or no effect on my chosen performance metric – less precise components cost less money.
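The idea of percentage contributions that sum to roughly 100% can be demonstrated with a simple Monte Carlo experiment: vary all parameters together to get the total variance, then vary one at a time to apportion it. The uniform tolerance model, the toy metric, and the run count are all my own assumptions; SystemVision's actual statistical method is not documented here.

```python
import random
import statistics

random.seed(1)

def metric(r, c):
    return 1.0 / (r * c)  # illustrative performance metric

NOM = {"r": 1e3, "c": 1e-6}   # nominal parameter values
TOL = {"r": 0.01, "c": 0.10}  # 1% and 10% tolerances (uniform model)

def sample(name):
    return NOM[name] * random.uniform(1 - TOL[name], 1 + TOL[name])

N = 20000
# Total variance with every parameter varying at once...
total = statistics.pvariance(
    [metric(sample("r"), sample("c")) for _ in range(N)])

# ...then one parameter at a time, the others held at nominal.
contrib = {}
for name in NOM:
    runs = [metric(**dict(NOM, **{name: sample(name)})) for _ in range(N)]
    contrib[name] = 100 * statistics.pvariance(runs) / total
# For a near-additive metric the percentages sum to roughly 100;
# here "c", with its loose 10% tolerance, dominates.
```

As the text says, more runs push the total closer to 100%, and the ranking immediately shows which tolerance to tighten (c's) and which can safely be loosened (r's).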

So what does all of this mean? First, and perhaps most important, if I only use the standard sensitivity analysis, I will miss important insights into my system – insights that will not only improve my system’s performance, but also might improve reliability and perhaps reduce overall manufacturing and maintenance costs. Second, no performance metric should be considered in isolation. Optimizing my system for a specific performance metric may have a negative effect on other metrics. I need to make sure I understand the performance metric priorities for my system. Finally, and probably most obvious, a sensitivity analysis should not be used in isolation. It’s just one option in SystemVision’s toolbox for making sure my system works as it should.

Statistical sensitivity, System sensitivity, Relative sensitivity, Tolerance-based sensitivity


About Mike Jensen

Most career paths rooted in high technology take many interesting (and often rewarding) twists and turns. Mine has certainly done just that. After graduating in electrical engineering from the University of Utah (go Utes!), I set off to explore the exciting, multi-faceted high tech industry. My career path since has wound its way from aircraft systems engineering for the United States Air Force, to over two decades in applications engineering and technical marketing for leading design automation software companies, working exclusively with mechatronic system modeling and analysis tools. Along the way, I’ve worked with customers in a broad range of industries and technologies including transportation, communications, automotive, aerospace, semiconductor, computers, and consumer electronics; all-in-all a very interesting, rewarding, and challenging ride. In my current gig, I work on technical marketing projects for Mentor Graphics' SystemVision product line. And in my spare time I dream up gadgets and gizmos, some even big enough to qualify as systems, that I hope someday to build -- providing I can find yet a little more of that increasingly elusive spare time.


