AI & ML interests

AI inference, AI in the cloud, AI at the edge, software acceleration of AI workloads on hardware, efficient AI deployments, GPU-free AI inference, and AI model optimization.

About Ampere Computing

Ampere is a modern semiconductor company designing the sustainable future of AI inference computing with the world’s first processors optimized for the cloud. Ampere CPUs are based on the broadly used Arm ISA and come fortified with an AI software ecosystem that enables a seamless transition from legacy x86 software. Mature cross-compatibility, native performance optimization, extensive industry collaboration, developer-friendly resources, demonstrated real-world results, and close-knit partnerships with AI software vendors ensure a smooth migration of AI applications to Ampere. Developers can harness the power of Ampere Cloud Native Processors without intricate configuration, reducing setup complexity and accelerating time to value for AI projects.

Ampere Optimized AI Frameworks

Ampere helps users achieve superior performance for AI workloads by integrating optimized inference layers into common AI frameworks. Ampere Optimized AI Frameworks are designed to leverage the unique capabilities of Ampere Cloud Native Processors, unlocking a new level of AI inference capability. Ampere’s optimized AI frameworks stand as a testament to the synergy between hardware and software, offering many advantages that redefine the AI experience. Ampere’s software acceleration strategy goes beyond providing optimized frameworks; it also offers an easy transition from AI model development to deployment. This approach streamlines the AI lifecycle, allowing businesses to swiftly operationalize their AI initiatives.

Ampere focuses on delivering the most efficient solutions for AI inference, offering seamless integration with AI applications by supporting the industry’s most popular AI frameworks, including PyTorch, TensorFlow, and ONNX Runtime. Ampere Optimized AI Frameworks work out of the box and require no API changes or additional coding. This drop-in library supports AI applications developed in the most popular frameworks and enables developers to seamlessly deploy their models across Ampere’s platforms.
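
As a minimal sketch of what "no API changes" means in practice, the script below is plain, framework-standard PyTorch code; run inside an Ampere Optimized PyTorch environment, the same unmodified script picks up the optimized backend automatically. The ResNet-50 model and the image filename are illustrative choices, not requirements.

```python
# Minimal sketch: a stock PyTorch inference script. No Ampere-specific API
# calls are needed; inside an Ampere Optimized PyTorch image, this same code
# transparently uses the optimized inference backend.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("example.jpg").convert("RGB")  # any RGB image
batch = preprocess(image).unsqueeze(0)            # shape: (1, 3, 224, 224)

with torch.no_grad():
    logits = model(batch)
print("Predicted class index:", int(logits.argmax(dim=1)))
```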

Our optimized software components allow an application to control runtime behavior, such as the number of threads and CPU binding, through a set of environment variables. Please refer to the download documentation for details.
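
For illustration, variables of this kind are typically set before the framework is imported so that the optimized backend reads them at load time. The variable names and values below are placeholders, not the documented names; consult the download documentation for the authoritative list.

```python
# Sketch only: configure inference threading and CPU binding through
# environment variables before importing the framework. The variable names
# below are placeholders; the real names and accepted values are listed in
# the Ampere download documentation.
import os

os.environ["AIO_NUM_THREADS"] = "16"  # placeholder: number of inference threads
os.environ["AIO_CPU_BIND"] = "1"      # placeholder: pin worker threads to cores

import torch  # imported after the variables are set so the backend sees them

with torch.no_grad():
    model = torch.nn.Linear(8, 2).eval()
    print(model(torch.randn(1, 8)))
```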

Where to Get It

The suite of Ampere AI software solutions ensures that AI models run efficiently and deliver optimal performance. Software optimizations directly contribute to achieving the desired efficiency and performance balance in AI inference.

We offer ready-to-use Docker images that can be pulled from the Ampere AI developer website; code snippets and documentation are provided along with the download link after you accept the end-user license agreement. Images are also available from our cloud partners in their marketplaces. Each Docker image includes a standard ML framework preinstalled with our optimized software and runs on any Ampere processor. You can run your inference scripts without change. Example models, such as image classification and object detection, are provided with the image.
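
A minimal sketch of that workflow is shown below. Because the real registry and image names are provided on the developer website after the license is accepted, the image name here is purely a placeholder, and `inference.py` stands in for any existing inference script.

```python
# Sketch: pull an Ampere-optimized framework image and run an existing
# inference script inside it, unchanged. The image name is a placeholder;
# the actual registry and tags are listed on the Ampere AI developer site.
import os
import subprocess

IMAGE = "example.registry/ampere-optimized-pytorch:latest"  # placeholder

# Equivalent to `docker pull <image>` on the command line.
subprocess.run(["docker", "pull", IMAGE], check=True)

# Mount the current directory and run the unmodified inference script.
subprocess.run([
    "docker", "run", "--rm",
    "-v", f"{os.getcwd()}:/workspace",
    "-w", "/workspace",
    IMAGE,
    "python3", "inference.py",  # your existing script, no changes needed
], check=True)
```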

Ampere Model Library (AML)

Ampere Model Library (AML), hosted on Ampere AI's public GitHub repo, is a collection of optimized AI models pretrained on standard datasets. The library contains scripts for running the most common AI tasks, and the models are available for Ampere customers to quickly and seamlessly build into their applications. Ampere provides individual models under a variety of free and open-source licenses; each model package identifies its associated license.

AML Benefits Include:

  • Benchmarking AI architectures with different frameworks (see the sketch after this list)
  • Accuracy testing of AI models on application-specific data
  • Comparison of AI architectures
  • Conducting tests on AI architectures
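
To make the benchmarking and comparison use cases concrete, here is a minimal, illustrative timing sketch (not AML code): it exports one model to ONNX and compares per-inference latency between PyTorch eager mode and ONNX Runtime, the kind of cross-framework measurement AML scripts automate.

```python
# Illustrative benchmarking sketch (not AML code): time the same model in
# PyTorch eager mode and in ONNX Runtime to compare framework-level latency.
import time
import torch
from torchvision import models
import onnxruntime as ort

model = models.resnet50(weights=None).eval()
dummy = torch.randn(1, 3, 224, 224)

# Export the PyTorch model to ONNX so both frameworks run the same network.
torch.onnx.export(model, dummy, "resnet50.onnx", opset_version=13)
session = ort.InferenceSession("resnet50.onnx")
input_name = session.get_inputs()[0].name

def bench(fn, runs=20):
    fn()  # warm-up
    start = time.perf_counter()
    for _ in range(runs):
        fn()
    return (time.perf_counter() - start) / runs

with torch.no_grad():
    pt_latency = bench(lambda: model(dummy))
ort_latency = bench(lambda: session.run(None, {input_name: dummy.numpy()}))

print(f"PyTorch: {pt_latency * 1000:.1f} ms, ONNX Runtime: {ort_latency * 1000:.1f} ms")
```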

AI Platform Alliance (AIPA)

Ampere Computing is one of the founders of the AI Platform Alliance. AIPA fosters open, efficient, and sustainable use of AI at scale, working to validate joint AI solutions that provide a better alternative to the GPU-based status quo and accelerate the pace of AI innovation.

Learn More or Contact Us Directly
