I spy with my little eye…a thousand other eyes looking back at me. Almost everywhere we look in today’s world, we find image sensors. The typical smartphone has at least two, and some contain up to five. Devices such as security cameras, in-car cameras, smart doorbells, and baby monitors can look in all directions, keep a watchful eye open 24 hours a day, and alert us when something happens. Drones use cameras to see where they are going and to capture images that are sometimes utilitarian, sometimes breathtaking, and sometimes otherwise impossible for humans to see. And these are just the sensors most of us see and use in our daily lives. Sensors also support inspection and quality control on manufacturing lines around the world, scan deep within our bodies to detect disease or injury without surgery, and monitor thousands of planes, trains, ships, and trucks every day.
Image sensor capacity, performance, cost, and usability have all radically improved in the past decade. Every sensor technology change opens up more possibilities for how and where sensors can be used, and every new sensor application seems to spark yet another idea for making some aspect of our lives easier, safer, faster, or more enjoyable. For example, internet-connected image sensors, a category that includes all those cell phones, doorbell cameras, and drones, make up one of the fastest-growing segments of this market, with a 34% compound annual growth rate (CAGR) through 2025 (figure 1).
One of the most obvious growth trends in sensor technology is image sensor pixel count. In 2007, the original iPhone launched with a 2MP camera. By 2014, the iPhone 6 camera was at 8MP. Today the iPhone 13 Pro has a 12MP sensor, representing a CAGR of roughly 12% over those years. Of course, that isn’t even close to the highest pixel count on a smartphone. The Samsung Galaxy S22 Ultra has a 108MP sensor, which puts its pixel-count CAGR at 33% over the last 14 years. And this growth isn’t stopping—there are many projects in the works for 300+MP image sensors.
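The growth figures above follow from the standard CAGR formula. A minimal sketch, using the pixel counts quoted in this article:

```python
# Compound annual growth rate (CAGR) of sensor pixel counts.
# Pixel counts and time spans are the ones quoted in the article.

def cagr(start, end, years):
    """CAGR = (end / start) ** (1 / years) - 1."""
    return (end / start) ** (1 / years) - 1

# iPhone: 2 MP (original, 2007) -> 12 MP (iPhone 13 Pro)
print(f"iPhone sensor CAGR: {cagr(2, 12, 14):.0%}")

# Galaxy S22 Ultra: 2 MP baseline -> 108 MP over 14 years
print(f"Galaxy S22 Ultra sensor CAGR: {cagr(2, 108, 14):.0%}")
```

The 108MP result lands at the 33% the article cites; the exact iPhone figure depends slightly on which endpoint years are assumed.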
Image sensors are also dramatically growing in complexity on multiple levels, driven by four main system objectives—image quality, power, capability, and cost. Rarely are these constraints independent, and each chip design prioritizes different objectives, depending on the intended application space.
These prioritized system objectives drive different design choices for the image sensor chip and the overall system. Now that 2.5D and 3D stacking of chips is a commodity process, many image sensor companies are taking advantage of layering to drive new image sensor solutions (figure 2). The technology in these stacked solutions typically falls into two broad categories—in-pixel processing to improve the quality and reduce the cost of the image capture itself, and on-die processing to minimize power and cost and improve security [1].
Researchers at Harvard recently developed an in-sensor processor that can be integrated into commercial silicon image sensor chips. On-die image processing, in which important features are extracted from raw data by the image sensor itself rather than by a separate microprocessor, speeds up visual processing [2]. Sony just released a sensor that can simultaneously output full-pixel images and high-speed “regions of interest.” This combination of sensor and on-die processing allows the solution to simultaneously output an entire scene, like a traffic intersection, as well as high-speed objects of interest, such as license plates or faces, greatly reducing overall system communication bandwidth while increasing response time.
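The bandwidth savings from streaming only regions of interest are easy to see with some back-of-the-envelope arithmetic. A sketch with hypothetical numbers (the resolutions, frame rates, and bit depth below are illustrative assumptions, not figures from the Sony sensor):

```python
# Illustrative comparison: streaming full frames off-chip versus streaming
# only small, high-speed regions of interest (ROIs). All figures are
# hypothetical assumptions chosen for the example.

def bandwidth_bps(width, height, bits_per_pixel, fps):
    """Raw pixel-data bandwidth in bits per second."""
    return width * height * bits_per_pixel * fps

# 12 MP full frame (4000 x 3000, 12-bit) at 30 fps
full = bandwidth_bps(4000, 3000, 12, 30)

# One 128 x 128 ROI (e.g. a license plate) tracked at 1000 fps
roi = bandwidth_bps(128, 128, 12, 1000)

print(f"full frames: {full / 1e9:.2f} Gbit/s")
print(f"ROI stream:  {roi / 1e6:.1f} Mbit/s")
```

Even at a 33x higher frame rate, the ROI stream needs an order of magnitude less bandwidth than the full-frame stream, which is why splitting the two on-die pays off.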
In another approach to sensor integration, Sony’s newest image sensor design [3] separates the photodiodes and pixel transistors that are normally placed on the same substrate, places them on different substrate layers, and then integrates them together with 3D stacking. The result is a sensor that approximately doubles the saturation signal level (essentially its light-gathering capability) to significantly improve the dynamic range and reduce noise. Sony expects this technology will enable increasingly high-quality imaging in smartphone photography without necessarily increasing the size of the smartphone sensor. Of course, there is also a set of solutions that combines both methods, such as augmented reality (AR) image sensors, with their combination of lowest power, best performance, and minimal form factor [4].
And then there is the quanta image sensor (QIS)—a new technology developed by Gigajot in which the image sensor contains hundreds of millions to billions of small pixels with photon-number-resolving and high dynamic range (HDR) capabilities. The QIS will potentially enable ultra-high pixel resolution that provides far superior imaging performance compared to charge-coupled device (CCD) and conventional complementary metal oxide semiconductor (CMOS) technologies. While the QIS is not yet in commercial production, test chips have been fabricated using a CMOS process with two-layer wafer stacking and backside illumination, resulting in a reliable device with an average read noise of 0.35 e‑ rms at room temperature operation [5].
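Why does a 0.35 e‑ rms read noise matter? Photon-number resolving requires read noise well below half an electron, so that rounding the analog readout recovers the exact integer photon count. A simulation sketch of that idea (the mean photon rate and sample count are arbitrary choices for illustration):

```python
import numpy as np

# Sketch: sub-0.5 e- read noise is what makes photon-number resolving work.
# Simulate Poisson photon arrivals, add Gaussian read noise, then round the
# analog readout back to an integer photon count and check how often the
# exact count is recovered.

rng = np.random.default_rng(0)

def fraction_exact(read_noise_e, mean_photons=2.0, n=100_000):
    photons = rng.poisson(mean_photons, n)               # true photon numbers
    readout = photons + rng.normal(0, read_noise_e, n)   # noisy analog output
    return np.mean(np.round(readout) == photons)         # exact recoveries

print(f"0.35 e- rms (QIS figure): {fraction_exact(0.35):.1%} exact")
print(f"2.0  e- rms (conventional): {fraction_exact(2.0):.1%} exact")
```

With the QIS-level noise, the large majority of pixel reads yield the exact photon number; with a conventional few-electron read noise, individual photons are unresolvable.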
Not surprisingly, as the complexity of these image sensor chips increases, design, verification, and reliability become more challenging. As image sensor and compute elements come together, the traditional rule of separating analog and digital power domains is, of necessity, violated. Analog and digital components operating within the same pixel must be accurately modeled for functionality, performance, and reliability. This analysis must capture the dynamic loading of the power grid from both the sensing and computation functions, because the heat they generate and the current they draw from the power grid can degrade the pixel’s ability to capture light.
According to Dr. Eric R. Fossum, Krehbiel Professor for Emerging Technologies at Dartmouth, Queen Elizabeth Prize Laureate, and one of the world’s leading experts in CMOS image sensors:
“Power management by design is very important in image sensors, especially those with high data rates or substantial on-chip computation. Power dissipation leads to heating, which in turn increases the ‘dark signal’ in the pixel photodiodes—the signal generated even when there is no light. Since each pixel’s dark signal may be different, an additive ‘fixed-pattern image’ of background signal is generated that is difficult to calibrate out of the desired image. The dark signal also contains temporal noise that further affects the low-light imaging capability of the image sensor. The addition of mixed-signal and digital-signal processing and computing in a 3D stacked image sensor further exacerbates the heating problem. Design tools to simulate and manage power dissipation are helpful to eliminate these sources of image quality deterioration during the design process.”
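The dark-signal mechanism Dr. Fossum describes can be sketched numerically. A common rule of thumb holds that photodiode dark current roughly doubles every 6 to 8 °C; the doubling interval, baseline rates, and pixel-to-pixel spread below are assumed values for illustration only:

```python
import numpy as np

# Sketch of dark signal vs. temperature. Assumption: dark current roughly
# doubles every ~7 C (a common rule of thumb; actual values vary by process).
# Each pixel has a slightly different baseline dark current, so heating
# raises both the mean dark signal and the pixel-to-pixel "fixed-pattern"
# component that is hard to calibrate out.

rng = np.random.default_rng(1)

DOUBLING_C = 7.0  # assumed doubling interval in degrees C

def dark_signal(base_e_per_s, temp_c, ref_c=25.0):
    """Dark signal rate at temp_c, given its rate at the reference temp."""
    return base_e_per_s * 2.0 ** ((temp_c - ref_c) / DOUBLING_C)

# 1000 pixels with slightly different baseline dark currents at 25 C
base = rng.normal(1.0, 0.1, 1000)  # e-/s per pixel, hypothetical

for t in (25.0, 40.0, 55.0):
    d = dark_signal(base, t)
    print(f"{t:.0f} C: mean {d.mean():.2f} e-/s, "
          f"fixed-pattern sigma {d.std():.2f} e-/s")
```

A 30 °C rise from on-die compute activity multiplies both the mean dark signal and its fixed-pattern spread by roughly 20x under this assumption, which is exactly why power dissipation must be managed at design time.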
These complexities mean that voltage (IR) drop and electromigration (EM) analysis can’t be left until the very end of the design cycle as a “checkbox signoff,” because the market risk of performance or chip manufacturing failures is too high. Thorough EM and IR analysis must now be an integral part of the image sensor design flow, which includes the image sensor and its data channel, as well as the high-performance processing connected to the sensor on the same die.
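The basic physics behind IR drop is simple, even though full-chip analysis is not. A deliberately minimal sketch of a single power rail fed from one end, with uniformly spaced cells tapping it (all values hypothetical; real signoff requires dynamic, full-grid analysis of the kind discussed below):

```python
# Minimal static IR-drop sketch: N cells tap a power rail fed from one end,
# each drawing current i_cell_a through rail segments of resistance r_seg_ohm.
# The current through segment k is the sum of all downstream cell currents,
# so the voltage drop accumulates toward the far end of the rail. This only
# illustrates why drop grows with scale; it is not a signoff method.

def rail_ir_drop(n_cells, i_cell_a, r_seg_ohm, vdd=1.0):
    """Supply voltage seen at each tap, nearest to farthest from the source."""
    v, taps = vdd, []
    for k in range(n_cells):
        downstream = n_cells - k          # cells fed through this segment
        v -= downstream * i_cell_a * r_seg_ohm
        taps.append(v)
    return taps

# 100 cells, 10 uA each, 50 mOhm per rail segment (hypothetical numbers)
taps = rail_ir_drop(n_cells=100, i_cell_a=10e-6, r_seg_ohm=0.05)
print(f"worst-case drop at far end: {(1.0 - taps[-1]) * 1e3:.2f} mV")
```

Because the worst-case drop scales with the square of the rail length at fixed current density, designs with hundreds of millions of transistors cannot rely on margins that worked for smaller blocks.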
However, such analysis is complicated by the large amount of analog content in these image sensors. Image sensor designers must be able to analyze and verify both analog and digital power integrity, which means analyzing analog designs with tens or hundreds of millions of transistors—far beyond the 1-2 million transistors that existing tools can handle. While traditional digital EM/IR analysis tools can easily process the digital portions of these die, a complete and scalable EM/IR solution for analog content has been lacking.
Super-block and chip-level analysis have traditionally been performed manually, using simplifications such as subsetting the design, employing less accurate simulators, and other ad hoc methods and approximations, all of which consume large amounts of engineering time to make up for the lack of automated tool support. Neither static analysis nor hand calculations provide the full coverage or confidence of simulation-based signoff. In addition, existing tools tend to create large numbers of false errors for typical analog layouts, requiring even more time and resources for debugging. This lack of detailed automated EM/IR analysis for large-scale analog circuits puts the whole image sensor system at risk.
In 2021, Siemens EDA introduced the mPower platform, which brings together analog and digital EM, IR drop, and power analysis in a complete, scalable solution for all designs at all nodes [6]. The mPower Analog high-capacity (HC) dynamic analysis provides EM/IR analysis on circuits of hundreds of millions of transistors—just the thing that these large-scale integrated sensors need. mPower HC dynamic analysis provides full-chip and array analyses from block-level SPICE simulations, giving designers the detailed analyses needed to confidently sign off on these large, complex sensor designs for manufacturing while enabling faster overall turnaround times. It can also enable faster iterations earlier in the design cycle by using pre-layout SPICE simulations. At the same time, the mPower Digital solution provides digital power integrity analysis with massive scalability to enable design teams to analyze the largest designs quickly and accurately. Together, the mPower Analog and Digital tools provide an unparalleled ability to model and analyze the IR drop and EM of a complete integrated sensor system, whether it is on one die or many.
Khandaker Azad, senior manager at ONSEMI in Santa Clara, had this to say after implementing the mPower tool, “We’re seeing significant improvement in the quality of EM/IR signoff by doing high-capacity dynamic EM/IR of the digital and analog blocks with the mPower tool. Its scalability, TCL-based flow, and above all, fast runtimes helped us cut down our turnaround time by severalfold. In summary, the mPower tool certainly brought confidence to our full-chip signoff analysis.”
As sensor designs continue to proliferate and evolve in complexity, the need for a scalable, innovative power integrity analysis solution will continue to grow with them. With the mPower platform, there is finally an IC power integrity analysis tool that is up to the task.
References

1. Zhu, Yuhao, “Opportunities and Challenges of Computing in Die-Stacked Image Sensors,” Computer Architecture Today, ACM SIGARCH, Jan. 22, 2022. https://www.sigarch.org/opportunities-and-challenges-of-computing-in-die-stacked-image-sensors/
2. Harvard John A. Paulson School of Engineering and Applied Sciences, “Silicon image sensor that computes” [press release], Aug. 25, 2022. https://www.seas.harvard.edu/news/2022/08/silicon-image-sensor-computes
3. Sony Semiconductor Solutions Group, “Sony Develops World’s First Stacked CMOS Image Sensor Technology with 2-Layer Transistor Pixel” [press release], Dec. 16, 2021. https://www.sony-semicon.com/en/news/2021/2021121601.html
4. C. Liu, S. Chen, T.-H. Tsai, B. de Salvo and J. Gomez, “Augmented Reality – The Next Frontier of Image Sensors and Compute Systems,” 2022 IEEE International Solid-State Circuits Conference (ISSCC), 2022, pp. 426-428. doi:10.1109/ISSCC42614.2022.9731584
5. Ma, J., Zhang, D., Robledo, D., et al., “Ultra-high-resolution quanta image sensor with reliable photon-number-resolving and high dynamic range capabilities,” Sci Rep 12, 13869 (2022). https://doi.org/10.1038/s41598-022-17952-z
6. Siemens Digital Industries Software, “Siemens introduces mPower power integrity solution for analog, digital and mixed-signal IC designs” [press release], Sept. 28, 2021. https://www.plm.automation.siemens.com/global/en/our-story/newsroom/siemens-mpower-power-integrity-analysis/101904
Joseph Davis is senior director of product management for Calibre interfaces and mPower power integrity analysis tools at Siemens Digital Industries Software, where he drives innovative new products to market. Prior to joining Siemens, Joseph managed yield simulation products and yield ramp projects at several leading semiconductor fabs, directing yield improvement engagements with customers around the world and implementing novel techniques for lowering the cost of new process technology development. Joseph earned his Ph.D. in Electrical and Computer Engineering from North Carolina State University. He can be reached at email@example.com.