
Graph AI chip challengers drive AI as a service

Feature articles |
By Nick Flaherty


The challenge of breaking into the data centre to provide AI services is driving chip makers to new business models.

Two leading AI chip designers, Graphcore and Blaize, have partnered with data centre operators to offer their chips to customers through AI as a service (AIaaS), including free access.

Bristol-based Graphcore is working with Gcore (previously Gcore Labs) on a new AI cloud cluster in Amsterdam, while Paperspace has put Graphcore’s Intelligence Processing Units (IPUs) into its Gradient platform and is offering free access to the chips for building, training, and deploying machine learning models.

Gcore’s AI Cloud is an AI-Infrastructure-as-a-Service offering, and the launch will give smaller companies the IT resources of a tech giant, enabling them to develop AI with less investment and minimal technical set-up.

Blaize, which has key design teams in the UK, has also teamed up with OrionVM to create an AIaaS offering. This provides access to Blaize Graph Streaming Processor (GSP) cards for edge AI and sensor fusion applications on the OrionVM cloud platform.


The chip makers are trying to tap into the growing market for AI development. According to International Data Corporation (IDC), worldwide spending on AI will increase from $50.1 billion in 2020 to more than $110 billion by 2024. Digital healthcare, manufacturing, and retail businesses are likely to expand their use of edge computing by 2028, according to the Linux Foundation’s State of the Edge report. 

Dedicated AI environments for clients can easily be created with the virtualized Blaize GSPs. For example, they can be integrated into video surveillance systems at the edge, providing sophisticated and constantly updated analysis of events.

The latest version of Blaize AI Studio is now available on the OrionVM cloud platform, allowing for the development of AI applications that can quickly be set up to perform workloads without needing to purchase and configure complex hardware environments.


“Before the availability of next-gen cloud solutions like ours, AI was cost-prohibitive owing to steep infrastructure spend and a shortage of qualified programmers. Now, companies do not need to build their clouds, or rely on inflexible and expensive public clouds to build, test, and utilize their artificial intelligence systems,” said Daniel Pfeiffer, COO and VP Partnerships for OrionVM.

“They can now take advantage of data insights through AIaaS without expensive up-front investments. This allows them to harness the power of machine learning at significantly lower costs,” he added. “We are excited to see offerings like Blaize’s AIaaS provide cloud advantages such as enterprise security and the ability to instantly deploy and scale.”

Blaize is targeting security and video surveillance, smart retail, smart city and transportation services, and AI for life sciences and healthcare.

“We built our AI solutions with a deep understanding of where AI technology began and where it can go. Our innovative approach has helped companies across various industries because we address their need for products purpose-built for the requirements of edge AI,” said Dinakar Munagala, co-founder and CEO of Blaize.

“Our solutions allow customers flexibility by programming AI solutions to fit their specific requirements. Our advanced code-free AI software also uniquely implements ‘edge-aware’ transfer learning and optimizations for higher accuracy post-model compression. The possibilities are almost limitless.”

As more companies begin to build artificial intelligence applications, investing in the infrastructure required to host them can be a roadblock to development. Gcore’s AI-Infrastructure-as-a-Service is designed to remove that barrier, giving even small companies the IT resources of a tech giant with minimal up-front investment and technical set-up.

“We are excited to be the cloud provider which is building the first European AI infrastructure. With our British partner Graphcore, we’re making it easier to integrate innovations for all kinds of businesses, small or large. Our second AI Cloud cluster is an important step in this journey,” said Andre Reitenbach, CEO of Gcore.

Gcore’s AI Cloud makes the Graphcore IPUs available for businesses of all sizes – speciality AI hardware, previously only available to enterprise companies, can be rented by a start-up on as little as a per-minute basis. The AI cloud also gives enterprises access to software tools and integrations to support work involving ML and AI such as TensorFlow, Keras, PyTorch, Hugging Face, Paddle and ONNX.
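For developers, renting IPU time looks much like targeting any other PyTorch accelerator. The snippet below is a minimal, hypothetical sketch using Graphcore’s PopTorch library to run a small PyTorch model on an IPU; the model, batch size and option settings are illustrative placeholders rather than part of Gcore’s documented workflow.

```python
# Minimal sketch: PyTorch inference on a Graphcore IPU via PopTorch.
# Assumes the Poplar SDK and the poptorch package are installed on the
# rented cloud instance; the model and shapes are placeholders.
import torch
import poptorch


class TinyClassifier(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(128, 64),
            torch.nn.ReLU(),
            torch.nn.Linear(64, 10),
        )

    def forward(self, x):
        return self.net(x)


model = TinyClassifier()
model.eval()

opts = poptorch.Options()                      # defaults to a single IPU
ipu_model = poptorch.inferenceModel(model, options=opts)

x = torch.randn(16, 128)                       # dummy batch of 16 samples
logits = ipu_model(x)                          # compiled and executed on the IPU
print(logits.shape)                            # torch.Size([16, 10])
```

The same model definition could be wrapped with poptorch.trainingModel() and an optimiser for training, which is the usual pattern when working with PopTorch.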

Alongside Gcore’s existing cluster in Luxembourg, the new cloud will allow enterprises to accelerate their AI and machine learning development across Western Europe. Gcore, which already has a cloud presence in over fifteen locations around the globe, plans to expand its AI Cloud to further locations in the future.

Paperspace users can also run state-of-the-art models on Graphcore IPUs in the cloud with pre-configured IPU-optimised runtimes. This eliminates the need to install drivers and SDKs, or run setup scripts, to get started. Developers worldwide can now run Graphcore tutorials and sample projects to explore the advantages of the IPU and the Poplar SDK.

“Paperspace has long been known as the cloud platform of choice for ML developers and is gaining significant traction with AI engineers working in AI-first platform companies, scale-ups, and large enterprise businesses, which makes it a great partner for Graphcore. This partnership is a win-win for the machine learning community, combining the proven performance advantages of Graphcore’s IPU technology with the ease of access and powerful development environment provided by Paperspace’s Gradient cloud platform,” said Graphcore co-founder and CEO Nigel Toon.

“The joint offering we are launching today provides the perfect way to learn how to optimise models for an IPU and to experience AI applications where the IPU really shines, like Graph neural networks and Transformer models for vision and natural language processing, all in an easy to use Jupyter notebook environment.”

Setting up IPU access with Paperspace is designed to be simple from start to finish, using pre-built Docker containers in the intuitive and familiar browser environment of Gradient Notebooks, a web-based Jupyter IDE running on IPU-POD16 Classic machines that deliver 4 petaFLOPS of AI compute.


To get started, users simply select an IPU Runtime within the Gradient workspace, which comes with a pre-loaded Docker container, code, and datasets. They can then access a curated list of the latest models showing the advantages of the IPU, including NLP, computer vision, and GNNs.

Users can browse and replicate a range of published model benchmarks out of the box without having to worry about setting up their development environment. This has been made even easier through the integration of a “Run on Gradient” button linking directly from the Graphcore Model Garden.

Available models include BERT-Large, RoBERTa, ViT (Vision Transformer), Cluster-GCN, TGN (Temporal Graph Network) and SchNet, a GNN-based model developed for modelling quantum interactions between atoms in a molecule.
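Inside one of those runtimes, a first notebook cell might simply confirm that IPU hardware is visible before launching a benchmark. The cell below is a minimal sketch, assuming the poptorch package is already present in the pre-loaded container; it is illustrative and not taken from the Graphcore tutorials themselves.

```python
# Minimal sketch of a sanity-check cell in a Gradient IPU notebook:
# report whether IPU hardware is visible to the Poplar SDK stack.
# Assumes poptorch ships in the pre-loaded container, as described above.
import poptorch

if poptorch.ipuHardwareIsAvailable():
    # Only query the architecture version once hardware is known to be present.
    print("IPU hardware detected, architecture version:",
          poptorch.ipuHardwareVersion())
else:
    print("No IPU hardware visible - check the selected IPU Runtime.")
```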

Paperspace eliminates the need to work with on-premise hardware or set up and maintain a work environment. Plus, since it is free, there is no need to commit to a future paid subscription of any kind.

www.blaize.com; www.graphcore.ai; www.gcore.com;  www.orionvm.com/BlaizeAI



