AI Research Support

As ASU researchers set out to solve humanity’s greatest challenges, our goal is to provide the best tools to empower them to succeed. AI is an essential part of the modern toolkit, offering new opportunities to advance research that matters to our communities today.

Knowledge Enterprise RTO and Research Computing at ASU provide world-class AI models, tools and high-performance computing resources to faculty, students, staff and community members.

By opening access to tools and platforms across all disciplines, we foster an inclusive environment for AI research that joins scientific, humanistic and artistic exploration.

Our infrastructure, which includes the Sol supercomputer, offers the power and flexibility to push boundaries in AI research — whether you are developing large language models (LLMs), accelerating AI-driven software, or applying deep learning techniques to complex problems.

Boosting research with AI tools

Studying and sorting through large datasets is an important part of research. This is exactly where AI models offer a major advantage — they are ideally suited to analyze huge amounts of information with speed.

Research Computing provides local access to powerful open-source AI models on the Sol supercomputer, displayed in the table below. Researchers can use these models separately or in combination to examine datasets stored locally. This improves data security by eliminating the need for cloud uploads. 
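
For illustration, here is a minimal sketch of what local inference against a locally stored dataset might look like from a Sol session. The checkpoint path and dataset filename are placeholders, and the sketch assumes the model is available as a Hugging Face-style checkpoint on Sol's filesystem, with the torch, transformers and accelerate packages installed in your Python environment.

```python
# Minimal sketch: run a locally hosted open-source model against a local dataset.
# Assumptions: the model is stored as a Hugging Face-style checkpoint on Sol's
# filesystem (the path below is a placeholder) and a GPU node has been allocated.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "/path/to/local/llama-3-8b"  # placeholder: ask Research Computing for the actual path
DATA_FILE = "my_local_dataset.txt"        # placeholder: a dataset stored on Sol, never uploaded to the cloud

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH,
    torch_dtype=torch.float16,
    device_map="auto",  # place the model on the allocated GPU(s); requires the accelerate package
)

with open(DATA_FILE) as f:
    excerpt = f.read()[:2000]  # keep the prompt well inside the model's context window

prompt = f"Summarize the key findings in the following notes:\n\n{excerpt}\n\nSummary:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)

# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```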

Sol also hosts a suite of AI frameworks and tools, including MONAI, AlphaFold3, TensorFlow and PyTorch, providing faculty, students and staff with a robust computing environment for AI-enabled research.
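
As a quick illustration, a short check like the one below (generic PyTorch, not Sol-specific) confirms from a compute session that the framework can see the allocated GPUs before you launch a larger job.

```python
# Sanity check that PyTorch can see the GPU(s) allocated to your session.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        print(f"GPU {i}: {torch.cuda.get_device_name(i)}")
```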

ASU’s advanced computing resources not only drive innovation and discovery in AI research but also contribute to numerous scientific publications across disciplines. Researchers leveraging ASU’s Sol supercomputer, local AI models and GPU acceleration have produced impactful studies in areas such as machine learning, large language models, biomedical AI and computational science.

Choosing the right AI model: What to know

Before diving into the model comparisons, here are terms to help interpret the table:

  • Parameters: Parameters are the “brainpower” of an AI model. The more parameters a model has, the more information it can process and the more complex patterns it can learn.
  • Context window: This refers to how much text a model can consider at once. A larger context window allows the model to understand and respond to longer inputs. This is useful for analyzing documents, conducting in-depth conversations or performing multi-step reasoning.
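
To make the context window concrete, the rough check below estimates whether a document fits in a given window. It is a sketch only; the assumption that one token corresponds to roughly 0.75 English words is a common approximation, and exact counts depend on each model's tokenizer.

```python
# Rough estimate: will a document fit in a model's context window?
# Assumption: for English prose, one token is roughly 0.75 words,
# so tokens ≈ words / 0.75. Exact counts depend on the tokenizer.

def estimated_tokens(text: str) -> int:
    return int(len(text.split()) / 0.75)

def fits_in_context(text: str, context_window: int, reserve_for_output: int = 1000) -> bool:
    # Leave room for the model's response as well as the input.
    return estimated_tokens(text) + reserve_for_output <= context_window

document = "word " * 50_000                 # a ~50,000-word document
print(estimated_tokens(document))           # ~66,666 tokens
print(fits_in_context(document, 8_000))     # False: too large for an 8k window
print(fits_in_context(document, 128_000))   # True: fits comfortably in a 128k window
```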

Open-source AI models on the Sol supercomputer

Model | Release date | Specialties and features | Parameters | Context window
ClimateGPT | January 2024 | Climate science-focused, adapted from Llama 2 with domain-specific training | 7B, 13B, 70B | 4k
Llama 3 | April 2024 | Improved reasoning, instruction-following, multilingual support | 8B, 70B | 8k
NVLM | September 2024 | Vision-language model that excels in OCR, multimodal reasoning and coding | 72B | N/A
Llama 3.3 | December 2024 | Optimized for multilingual dialogue, with performance comparable to Llama 3.1 (405B) | 70B | 128k
DeepSeek R1 | January 2025 | Mixture of experts, strong reasoning, general-purpose and specialized tasks | 1.5B–70B (distilled), 671B (MoE, 37B active) | 128k
Falcon 3 | February 2025 | Focus on science, math and coding, with various instruction-tuned and base models | 1B, 3B, 7B, 10B | 32k
Gemma 3 | March 2025 | Multimodal (text and image) instruction-tuned versions | 1B, 4B, 12B, 27B | 32k (1B), 128k (others)

Model selection tips

The best model for your research depends on your needs. Here are some general guidelines:

  • For fast, resource-efficient tasks, consider the smaller Falcon 3 or Llama variants with fewer parameters.
  • For working with long documents or multi-step prompts, choose a model with a large context window (e.g., Llama 3.3, DeepSeek R1 or Gemma 3).
  • For research requiring high performance in reasoning, coding or technical language, DeepSeek R1 and Falcon 3 are good choices.
  • For multimodal vision-language tasks, such as OCR and document image analysis, consider NVIDIA’s NVLM.
  • For multilingual work, Llama 3.3 supports multiple languages and is well suited to cross-language tasks.

If you’re unsure where to start, Llama models offer a solid balance of performance and flexibility across many research domains.

Getting started

All AI models listed on this page are freely available to ASU researchers on the Sol supercomputer. To use them, you must first request access to Sol. Faculty and research staff may request accounts directly; students and other non-faculty users must be sponsored by an ASU faculty member. Once your account is approved, you will receive onboarding instructions and access credentials.

Next, to begin using the models, schedule a consultation with the Research Computing team. We’ll walk you through accessing the Sol supercomputer via the web portal, launching a Jupyter Notebook and selecting the appropriate tools for your research. This personalized guidance will help ensure you’re set up for success — especially if you’re new to working with AI models in a supercomputing environment.
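
As a small example of "selecting the appropriate tools," the snippet below lists the contents of a shared model directory from a notebook. The directory path is a placeholder; Research Computing can point you to the actual location of the model checkpoints during your consultation.

```python
# Sketch: list model checkpoints available in a shared directory on Sol.
# The path below is a placeholder, not the real location; confirm it with
# Research Computing during onboarding.
from pathlib import Path

MODEL_ROOT = Path("/path/to/shared/models")  # placeholder path

if MODEL_ROOT.exists():
    for checkpoint in sorted(MODEL_ROOT.iterdir()):
        print(checkpoint.name)
else:
    print("Model directory not found; confirm the path with Research Computing.")
```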

Bringing AI to the classroom

ASU equips students to engage in hands-on AI learning, giving them valuable experience with technology that continues to shape the modern workforce. Students can access AI applications on the Sol supercomputer if their instructors request student accounts from Research Computing, which provides AI computing support for academic courses.

Students with accounts can immediately begin working with cutting-edge AI tools without instructors needing to handle software setup. Research Computing pre-configures software environments, including several Python environments and ready-to-use Jupyter notebooks integrated with AI models. 
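
For example, a student can confirm from a course notebook that the pre-configured environment includes the expected packages. The package names below are examples; each course environment may include a different selection.

```python
# Check which AI packages are installed in the current Python environment.
from importlib.metadata import version, PackageNotFoundError

for package in ("torch", "tensorflow", "transformers", "monai"):
    try:
        print(f"{package}: {version(package)}")
    except PackageNotFoundError:
        print(f"{package}: not installed in this environment")
```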

Below is a sample of AI-related courses supported by Research Computing in the last academic year.

Course number | Course name | Term
EEE 549 | Statistical Machine Learning: From Theory to Practice | Fall 2024
CSE 575 | Statistical Machine Learning | Fall 2024
CSE 598 | Frontier Topics in GenAI | Fall 2024
EEE 598 | Algorithm/Hardware Co-Design/Design Automation for Emerging AI Hardware | Fall 2024
EEE 598 | Deep Learning: Foundations and Applications | Fall 2024
CSE 576 | Topics in Natural Language Processing | Fall 2024
EEE 598 | Generative AI: Theory and Practice | Spring 2025
CIS 508 | Machine Learning in Business | Spring 2025
CEN 524, CSE 524, CSE 494 | Machine Learning Acceleration | Spring 2025
CSE 575 | Statistical Machine Learning | Spring 2025
CSE 476 | Introduction to Natural Language Processing | Spring 2025
MFG 598 | AI in Additive Manufacturing | Spring 2025
CSE 576 | Topics in Natural Language Processing | Spring 2025
FIN 597 | AI and Machine Learning Applications in Finance | Spring 2025

Events

Research Computing provides cutting-edge resources, training and events to empower researchers in the rapidly evolving field of AI.

AI training and workshops

Research Computing hosts two annual flagship events, GPU Day and AI Ignition, in addition to dozens of other interactive training sessions each year. For a full list of upcoming workshops, visit the Research Computing documentation site. Sessions cover a range of AI topics, including:

  • Harnessing the potential of large language models for innovation
  • Accelerating research with GPUs
  • Advanced research acceleration with GPUs
  • Python: Machine learning
  • Python: Deep learning

Explore past workshop materials and recordings on the Research Computing Expo page.

A safe AI environment

Many scientists want to leverage AI but also need to protect proprietary data. The university’s OpenAI API, offered through Research Computing, empowers ASU researchers to integrate OpenAI’s advanced models into their projects while ensuring their data remains private and owned by ASU.

This service supports applications in natural language processing, data analysis and AI-driven innovation. ASU provides an initial trial period at no cost. After the trial period, researchers are responsible for paying for continued access.
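
As an illustration, a call through the service might look like the sketch below, which uses the standard OpenAI Python client. The endpoint URL, credential and model name are placeholders; Research Computing supplies the actual connection details when a project is approved.

```python
# Minimal sketch of calling an OpenAI-compatible endpoint from Python.
# The base_url, API key and model name are placeholders, not real values.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_PROJECT_KEY",                    # placeholder credential
    base_url="https://example.asu.edu/openai/v1",  # placeholder endpoint
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder: use the model your project is provisioned for
    messages=[
        {"role": "system", "content": "You are a research assistant."},
        {"role": "user", "content": "Summarize the main limitations noted in our experiment log."},
    ],
)
print(response.choices[0].message.content)
```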

Did you know?

In 2020, ASU was named a Dell Technologies High Performance Computing and Artificial Intelligence Center of Excellence. This recognition gave the university access to a worldwide program that facilitates the exchange of ideas among researchers, computer scientists, technologists and engineers for the advancement of high-performance computing and AI solutions. ASU later leveraged its relationship with Dell Technologies to co-design and build the Sol supercomputer announced in 2022.