As part of that initiative, the chip company has published a white paper entitled "The morals of algorithms." In it, the company discusses five principles for the development of AI systems: non-maleficence, human autonomy, explicability, continued attention and vigilance, and privacy and security by design.

NXP recently launched the Glow compiler for the optimization of neural networks running on microcontrollers (see MCU-based implementation of Glow neural network compiler).

“In addition to our strong innovation-minded spirit, ethics are core to who we are as NXP,” said Kurt Sievers, CEO of NXP, in a statement. “As innovators in AI, we are committed to applying ethical principles. Consumers depend on AI for more responsibilities and decision making in their lives, especially at the edge where people want their devices to operate transparently, fairly and safely, while giving them control over their privacy. And security is key – we believe that building trust in AI starts with protecting devices.”

NXP said it plans to develop education programs to help employees implement the five AI principles discussed in the white paper. The programs will be supported by engagement with academic institutions, research organizations and other commercial technology firms. NXP is also a partner in the Charter of Trust, a cross-industry initiative founded in 2018 to make the digital world of tomorrow safer.

“By building these ethical principles into the devices that sense, interpret, and analyze data at the edge, we can enable AI that acts ethically,” concluded Sievers.

Related links and articles:

The morals of algorithms

News articles:

MCU-based implementation of Glow neural network compiler

Book review: NANOCHIPS 2030 charts key chip technologies

Development environment eases machine learning on to microcontrollers

AI startup claims first steps towards human reasoning

