As ASU researchers set out to solve humanity’s greatest challenges, our goal is to provide the best tools to empower them to succeed. AI is an essential part of the modern toolkit, offering new opportunities to advance research that matters to our communities today.
Knowledge Enterprise RTO and Research Computing at ASU provide world-class AI models, tools and high-performance computing resources to faculty, students, staff and community members.
By opening access to tools and platforms across all disciplines, we foster an inclusive environment for AI research that joins scientific, humanistic and artistic exploration.
Our infrastructure, which includes the Sol supercomputer, offers the power and flexibility to push boundaries in AI research — whether you are developing large language models (LLMs), accelerating AI-driven software, or applying deep learning techniques to complex problems.
Boosting research with AI tools
Analyzing and sorting large datasets is a core part of research, and it is where AI models offer a major advantage: they can process huge amounts of information quickly.
Research Computing provides local access to powerful open-source AI models on the Sol supercomputer, displayed in the table below. Researchers can use these models separately or in combination to examine datasets stored locally. This improves data security by eliminating the need for cloud uploads.
Sol also hosts a suite of AI-powered tools, including MONAI, AlphaFold3, TensorFlow and PyTorch. These provide faculty, students and staff with a robust computing environment for AI-enabled research.
ASU’s advanced computing resources not only drive innovation and discovery in AI research but also contribute to numerous scientific publications across disciplines. Researchers leveraging ASU’s Sol supercomputer, local AI models and GPU acceleration have produced impactful studies in areas such as machine learning, large language models, biomedical AI and computational science.
Choosing the right AI model: What to know
Before diving into the model comparisons, here are terms to help interpret the table:
- Parameters: The “brainpower” of an AI model. The more parameters a model has, the more information it can process and the more complex patterns it can learn.
- Context window: This refers to how much text a model can consider at once. A larger context window allows the model to understand and respond to longer inputs. This is useful for analyzing documents, conducting in-depth conversations or performing multi-step reasoning.
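As a rough illustration of why context window size matters, the sketch below estimates whether a document fits in a given window. The 0.75 words-per-token ratio is an approximation for English text, not an exact tokenizer count; real models count tokens with their own tokenizers.

```python
# Back-of-the-envelope check of whether a document fits in a model's
# context window. The 0.75 words-per-token ratio is a rough rule of
# thumb for English text, not an exact tokenizer count.
def estimated_tokens(text, words_per_token=0.75):
    return int(len(text.split()) / words_per_token)

def fits_in_context(text, context_window):
    return estimated_tokens(text) <= context_window

doc = "word " * 5000                   # stand-in for a ~5,000-word paper
print(estimated_tokens(doc))           # 6666 estimated tokens
print(fits_in_context(doc, 4_000))     # False: too long for a 4k window
print(fits_in_context(doc, 128_000))   # True: fits easily in a 128k window
```

By this rough estimate, a 5,000-word paper overflows a 4k window but fits comfortably in the 128k windows of the newer models in the table below.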
Open-source AI models on the Sol supercomputer
| Model | Release date | Specialties and features | Parameters | Context window |
|---|---|---|---|---|
| ClimateGPT | January 2024 | Climate science-focused, adapted from Llama 2 with domain-specific training | 7B, 13B, 70B | 4k |
| Llama 3 | April 2024 | Improved reasoning, instruction-following, multilingual support | 8B, 70B | 8k |
| NVLM | September 2024 | Vision-language model that excels in OCR, multimodal reasoning and coding | 72B | NA |
| Llama 3.3 | December 2024 | Optimized for multilingual dialogue, comparable to Llama 3.1 (405B) | 70B | 128k |
| DeepSeek R1 | January 2025 | Mixture of experts, strong reasoning, general-purpose and specialized tasks | 1.5B–70B (distilled), 671B (MoE, 37B active) | 128k |
| Falcon 3 | December 2024 | Focus on science, math and coding, with various instruction-tuned and base models | 1B, 3B, 7B, 10B | 32k |
| Gemma 3 | March 2025 | Multimodal (text and image) instruction-tuned versions | 1B, 4B, 12B, 27B | 32k (1B), 128k (others) |
Model selection tips
The best model for your research depends on your needs. Here are some general guidelines:
- For fast, resource-efficient tasks, consider smaller models such as Falcon 3 or Llama variants with fewer parameters.
- For long documents or multi-step prompts, choose a model with a large context window (e.g., Llama 3.3, DeepSeek R1 or Gemma 3).
- For research requiring strong performance in reasoning, coding or technical language, DeepSeek R1 is a good choice.
- For vision-language tasks such as OCR, multimodal reasoning and image-grounded coding, consider NVIDIA’s NVLM.
- For multilingual work, Llama 3.3 supports multiple languages and is well suited to cross-language tasks.
If you’re unsure where to start, Llama models offer a solid balance of performance and flexibility across many research domains.
Getting started
All AI models listed on this page are freely available to ASU researchers on the Sol supercomputer. To use them, you must first request a Sol account. Faculty and research staff may request accounts directly; students and non-faculty users must be sponsored by an ASU faculty member. Once your account is approved, you will receive onboarding instructions and access credentials.
Next, to begin using the models, schedule a consultation with the Research Computing team. We’ll walk you through accessing the Sol supercomputer via the web portal, launching a Jupyter Notebook and selecting the appropriate tools for your research. This personalized guidance will help ensure you’re set up for success — especially if you’re new to working with AI models in a supercomputing environment.
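While the web portal and Jupyter are the recommended entry points, command-line users typically submit work through Sol’s Slurm scheduler. The batch script below is a minimal, hypothetical sketch: the job parameters, module names, environment name and script path are all placeholders, and the actual values come from your onboarding materials and the Research Computing documentation.

```shell
#!/bin/bash
#SBATCH --job-name=llm-inference   # job name shown in the queue
#SBATCH --gres=gpu:1               # request one GPU
#SBATCH --time=01:00:00            # one-hour wall-clock limit
#SBATCH --mem=64G                  # host memory for the job

# Module and environment names are placeholders; check Sol's
# documentation for the actual names available on the system.
module load mamba/latest
source activate my-ai-env

# Run an inference script against a locally stored model.
python run_inference.py --model /path/to/local/model
```

Submitting the script with `sbatch` queues the job; keeping the model path local to Sol preserves the data-security benefit of avoiding cloud uploads.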
Bringing AI to the classroom
ASU equips students to engage in hands-on AI learning, giving them valuable experience with technology that continues to shape the modern workforce. Students can access AI applications on the Sol supercomputer if their instructors request student accounts from Research Computing, which provides AI computing support for academic courses.
Students with accounts can immediately begin working with cutting-edge AI tools without instructors needing to handle software setup. Research Computing pre-configures software environments, including several Python environments and ready-to-use Jupyter notebooks integrated with AI models.
Below is a sample of AI-related courses supported by Research Computing in the last academic year.
| Course number | Course name | Semester |
|---|---|---|
| EEE 549 | Statistical Machine Learning: From Theory to Practice | Fall 2024 |
| CSE 575 | Statistical Machine Learning | Fall 2024 |
| CSE 598 | Frontier topics in GenAI | Fall 2024 |
| EEE 598 | Algorithm/Hardware CoDesign/Design Automation for Emerging AI Hardware | Fall 2024 |
| EEE 598 | Deep Learning: Foundations and Applications | Fall 2024 |
| CSE 576 | Topics in Natural Language Processing | Fall 2024 |
| EEE 598 | Generative AI: Theory and Practice | Spring 2025 |
| CIS 508 | Machine Learning in Business | Spring 2025 |
| CEN 524, CSE 524, CSE 494 | Machine Learning Acceleration | Spring 2025 |
| CSE 575 | Statistical Machine Learning | Spring 2025 |
| CSE 476 | Introduction to Natural Language Processing | Spring 2025 |
| MFG 598 | AI in Additive Manufacturing | Spring 2025 |
| CSE 576 | Topics in Natural Language Processing | Spring 2025 |
| FIN 597 | AI and Machine Learning Applications in Finance | Spring 2025 |
Events
Research Computing provides cutting-edge resources, training and events to empower researchers in the rapidly evolving field of AI.
AI training and workshops
For a full list of upcoming workshops, visit the Research Computing documentation site.
Research Computing hosts two annual flagship events, GPU Day and AI Ignition, in addition to dozens of other interactive training sessions each year. Sessions cover a range of AI topics, including:
- Harnessing the potential of large language models for innovation
- Accelerating research with GPUs
- Advanced research acceleration with GPUs
- Python: Machine learning
- Python: Deep learning
Explore past workshop materials and recordings on the Research Computing Expo page.
A safe AI environment
Many scientists want to leverage AI but also need to protect proprietary data. The university’s OpenAI API, offered through Research Computing, empowers ASU researchers to integrate OpenAI’s advanced models into their projects while ensuring their data remains private and owned by ASU.
This service supports applications in natural language processing, data analysis and AI-driven innovation. ASU provides an initial trial period at no cost. After the trial period, researchers are responsible for paying for continued access.
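To illustrate the integration pattern, the sketch below assembles a request body in OpenAI’s Chat Completions message format. The model name, endpoint and system prompt are placeholders; the actual base URL, model list and API key are issued by Research Computing when your access is provisioned.

```python
import json

def build_chat_request(prompt, model="gpt-4o", max_tokens=256):
    """Assemble a Chat Completions request body (OpenAI message format)."""
    return {
        "model": model,
        "messages": [
            # A system message sets the assistant's behavior; the user
            # message carries the actual research question.
            {"role": "system", "content": "You are a research assistant."},
            {"role": "user", "content": prompt},
        ],
        "max_tokens": max_tokens,
    }

body = build_chat_request("Summarize the key findings of this abstract.")
print(json.dumps(body, indent=2))

# With the official openai client, the same body would be sent as:
#   from openai import OpenAI
#   client = OpenAI(base_url="<ASU endpoint>", api_key="<your key>")
#   response = client.chat.completions.create(**body)
```

Because requests go through the university’s endpoint rather than a personal account, prompts and responses stay under ASU’s data-ownership terms.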
Did you know?
In 2020, ASU was named a Dell Technologies High Performance Computing and Artificial Intelligence Center of Excellence. This recognition gave the university access to a worldwide program that facilitates the exchange of ideas among researchers, computer scientists, technologists and engineers for the advancement of high-performance computing and AI solutions. ASU later leveraged its relationship with Dell Technologies to co-design and build the Sol supercomputer announced in 2022.