inf1.6xlarge

Businesses across a diverse set of industries are pursuing AI-powered transformation to drive innovation, improve customer experience, and streamline processes.

The inf1.6xlarge is an AWS EC2 instance type in the Inf1 family, which pairs Intel Xeon vCPUs with AWS Inferentia accelerator chips and is built for machine-learning inference workloads.

Last updated: Thu Apr 30 2026
Instance Info
Compute Value
vCPUs 24
Memory 48 GiB
Physical Processor Intel Xeon Platinum 8275CL (Cascade Lake)
CPU Architecture 64-bit
Accelerators (AWS Inferentia) 4
Accelerator Memory NA
Network Value
Network Performance 25 Gigabit
Enhanced Networking No
Storage Value
Storage EBS only
EBS Throughput 3500 Mbps
Amazon Value
Current Generation Yes
Instance Type inf1.6xlarge
Family Machine Learning ASIC Instances
Inf1 Family Instances
Instance vCPUs Memory (GiB)
inf1.xlarge 4 8
inf1.2xlarge 8 16
inf1.6xlarge 24 48
inf1.24xlarge 96 192
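The family table above scales memory linearly with vCPU count. A minimal Python sketch (the figures are hardcoded from the table, not fetched from AWS) confirms that every Inf1 size provides the same 2 GiB of memory per vCPU:

```python
# Inf1 family sizes from the table above: name -> (vCPUs, memory in GiB).
# Hardcoded for illustration; in practice these come from AWS documentation
# or the EC2 DescribeInstanceTypes API.
INF1_FAMILY = {
    "inf1.xlarge": (4, 8),
    "inf1.2xlarge": (8, 16),
    "inf1.6xlarge": (24, 48),
    "inf1.24xlarge": (96, 192),
}

def memory_per_vcpu(family):
    """Return a mapping of instance name -> GiB of memory per vCPU."""
    return {name: mem / vcpus for name, (vcpus, mem) in family.items()}

ratios = memory_per_vcpu(INF1_FAMILY)
print(ratios)  # every Inf1 size works out to 2.0 GiB per vCPU
```

This uniform ratio means that moving up or down within the family changes capacity without changing the memory-to-compute balance.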
Related Instances
Instance vCPUs Memory
c5n.4xlarge 16 42 GiB
c4.8xlarge 36 60 GiB
c3.8xlarge 32 60 GiB
vt1.6xlarge 24 48 GiB
m5zn.3xlarge 12 48 GiB