The theme of the 5th IBM IEEE AI Compute Symposium (AICS), held at an IBM research center last fall in New York, was "scalability to sustainability." Symposium presenters from industry and academia covered a range of topics, including device technology, circuits, architecture, algorithms and sustainability, with an emphasis on innovations for the cloud and green artificial intelligence.
Tamar Eilam, an IBM fellow, gave a talk titled "The Road to Sustainable Computing." She told participants about IBM's initiative on sustainable and responsible computing and described how to design a sustainable data center. Key steps include finding pathways to multi-data-center sustainability goals by incorporating exogenous factors such as natural resources, cooling, water and energy for IT, along with other constraints, within a holistic data-center model.
Eilam also emphasized the importance of explainable AI and counterfactual analysis, with a focus on capital and operational costs alongside environmental impacts. Such impacts include capitalized carbon-footprint emissions, the operational carbon footprint and operational water use. Eilam pointed out the need for using renewable energy, such as solar, wind and wave energy.
“The key lies in utilizing sustainable computing by predicting the power consumption, co-optimization of software/system-based behavior and coupling renewable energy,” she said.
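As a back-of-the-envelope illustration of the kind of accounting Eilam described, one can combine amortized embodied (capitalized) carbon with operational carbon derived from energy use and grid intensity. The function and all figures below are hypothetical assumptions for illustration; they are not from the talk.

```python
# Illustrative sketch (hypothetical numbers, not from the talk): estimating a
# data center's annual carbon footprint from embodied and operational parts.

def annual_carbon_kg(
    embodied_kgco2: float,                 # capitalized emissions of the hardware
    lifetime_years: float,                 # amortization period for embodied carbon
    it_energy_kwh: float,                  # annual energy consumed by IT equipment
    pue: float,                            # power usage effectiveness of the facility
    grid_intensity_kgco2_per_kwh: float,   # carbon intensity of the energy mix
) -> float:
    """Amortized embodied carbon plus operational carbon for one year."""
    amortized_embodied = embodied_kgco2 / lifetime_years
    facility_energy = it_energy_kwh * pue  # cooling and overhead scale with IT load
    operational = facility_energy * grid_intensity_kgco2_per_kwh
    return amortized_embodied + operational

# Coupling renewable energy, as Eilam suggests, lowers the grid-intensity term:
fossil = annual_carbon_kg(1_000_000, 5, 2_000_000, 1.4, 0.40)
renewable = annual_carbon_kg(1_000_000, 5, 2_000_000, 1.4, 0.05)
```

With these made-up inputs, switching the energy mix cuts the annual footprint by well over half, which is the intuition behind coupling data centers to renewable supply.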
Eilam also participated in a panel discussion on sustainability with Robert Muchsel, Analog Devices Inc. (ADI) fellow; Christopher Hill of the Massachusetts Institute of Technology; Prashant Shenoy of the University of Massachusetts; and Aaron Thean, dean of the College of Design and Engineering at the National University of Singapore (NUS). The discussion focused on sustainability at all levels, including algorithmic and architecture techniques, data center carbon-footprint reduction, product development and applications in which AI can help.
The symposium’s presentations were recorded and are available on IBM’s website. Other talks included:
- “Improving Privacy and Energy Usage by Pushing AI Inference to the Edge of the IoT Frontier,” by ADI’s Muchsel. Although AI may dominate the tech news, most AI solutions are expensive, big and energy-hungry, Muchsel said, and the connected nature of these systems leads to significant concerns related to privacy and system autonomy.
- "Novel Material-System Co-Design Opportunities for Analog-Non-Volatile In-Memory Computing and Reconfigurable Edge-AI," by NUS's Thean, who spoke about the ultra-low–energy, area-efficient electronic systems required to enable untethered computing at the edge of the IoT. Conventional, solely software-driven deep-learning neural networks are a major roadblock to realizing self-learning edge-AI systems because of the excessive energy expense of training them; a fundamental hardware change is therefore likely needed, Thean said.
- “Technology Co-Design and Innovation for the Age of Ambient Intelligence,” by Tsu-Jae King Liu, dean of the College of Engineering at the University of California, Berkeley. As developers reach the practical limits for transistor miniaturization, alternative approaches for improving integrated circuit functionality and energy efficiency at an acceptable cost will be necessary to meet the growing demand for information and communication technology, Liu told the audience. She also showcased how technology co-design and innovation can achieve dramatic improvements in computing performance to usher in the “age of ambient intelligence.”
- “Using Coral for Scalable and Sustainable AI at the Edge,” by Bill Luan, senior program manager of the Coral (a toolkit to build local AI) team at Google. With the advancement in AI research over the past decade, AI/ML technology has expanded from being available only on cloud-based data centers to becoming available on IoT and edge devices, opening huge opportunities for innovation, Luan told the audience. Leading this change is the Coral platform from Google, he said, making deploying AI at the edge on a large scale not only possible but also sustainable.
- "Confluence of AI & Cloud with EDA," by Arun Venkatachar, VP of AI, cloud and central engineering at Synopsys. Venkatachar said that investments in AI/ML, cloud and big data to tackle EDA problems posed by the ever-increasing complexity of chip design are starting to bear fruit.
- “Automated Synthesis and Architecture [AutoSA] Optimization for Deep Learning Accelerator Designs,” by Jason Cong, director of the Center for Customizable Domain-Specific Computing and director of the VLSI Architecture, Synthesis and Technology (VAST) Laboratory at the University of California, Los Angeles. AutoSA is based on the polyhedral framework and incorporates a set of techniques for both computation and communication optimizations.
- “New Materials for Three-Dimensional Ferroelectric Microelectronics,” by Susan Trolier-McKinstry, a professor of ceramic science and engineering at Pennsylvania State University. In the last decade, there have been major changes in the families of ferroelectric materials available for integration with CMOS electronics, including Hf1-xZrxO2, Al1-xScxN, Al1-xBxN and Zn1-xMgxO, which offer the possibility of new functionalities, Trolier-McKinstry said.
- “Heterogeneous Multi-Core tinyML,” by Marian Verhelst, professor at the MICAS Laboratories of KU Leuven and a research director at imec. Verhelst described approaches for powerful machine inference in resource-scarce distributed devices. Developing intelligent applications at ultra-low energy and low latency requires compact compute and memory structures that have very high utilization, she said. This has resulted in a wide variety of proposed state-of-the-art accelerator designs.
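Verhelst's point about needing very high utilization can be sketched with a simple roofline-style estimate: a workload whose arithmetic intensity falls below an accelerator's compute-to-bandwidth ratio becomes memory-bound and leaves the datapath idle. The accelerator figures below are hypothetical, chosen only to illustrate the idea.

```python
# Illustrative roofline-style check (hypothetical numbers, not from the talk):
# attainable performance is capped either by peak compute or by how fast
# memory can feed the datapath.

def attainable_gflops(peak_gflops: float, bandwidth_gbs: float,
                      intensity_flops_per_byte: float) -> float:
    """Roofline model: min of compute roof and memory-bandwidth roof."""
    return min(peak_gflops, bandwidth_gbs * intensity_flops_per_byte)

# Hypothetical tinyML accelerator: 100 GFLOP/s peak, 10 GB/s memory bandwidth,
# so the ridge point sits at 100 / 10 = 10 FLOPs per byte.
low = attainable_gflops(100.0, 10.0, 2.0)    # memory-bound layer: 20 GFLOP/s
high = attainable_gflops(100.0, 10.0, 50.0)  # compute-bound layer: full 100 GFLOP/s
```

In this toy setting, the memory-bound layer reaches only 20% of peak, which is why compact compute and memory structures with high utilization matter so much at ultra-low energy budgets.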
- "Machine Learning for Real: Thinking More Carefully About Efficiency, Loss Functions and GANs," by Perceive CEO Steve Teig. "It's concerning to note the extent to which today's deep learning relies on folklore—on recipes and anecdotes rather than on scientific principles and explanatory mathematics," Teig told the audience.
- “AI at the Open Edge,” by Stefanie Chiras, SVP for partner ecosystem success at Red Hat. Complex use cases and game-changing potential collide when AI is delivered at the edge, Chiras said, which creates a perfect petri dish for innovation. Red Hat sees an opportunity to extend the open hybrid cloud, bringing capability all the way out to the far edge, even as far as the International Space Station.