MSI Gaming Radeon RX 570 256-bit 8GB GDDR5 DirectX 12 VR Ready CFX Graphics Card (RX 570 ARMOR 8G OC)

(3 customer reviews)

Experience the next level of immersion in VR gaming and entertainment with the MSI RX 570 ARMOR 8G OC graphics card, powered by the revolutionary AMD Polaris architecture. It puts an end to choppy gameplay and broken frames with fluid, artifact-free performance at virtually any framerate…

Last updated on January 28, 2021 3:53 am


I got the Graphics Card!! RX570 MSI Armor 8GB Unboxing, Install, Test with Ryzen 2200G

Finally got the graphics card. Really happy with the results! Reviewing, unboxing & installing the MSI RX570 graphics card. RX570 US ...

The 1080p 60FPS PC Gaming Value KING (2019) - MSI Armor AMD Radeon RX 570 8GB Review

The AMD Radeon RX 570 has been on the market since April of 2017 and since that time has become well known as the best bang for the buck graphics card ...

Additional information

Specification: MSI Gaming Radeon RX 570 256-bit 8GB GDDR5 DirectX 12 VR Ready CFX Graphics Card (RX 570 ARMOR 8G OC)

Memory Speed (MHz): —

Graphics Coprocessor: Radeon RX 570

Chipset Brand: AMD

Graphics Card Ram Size (GB): 8

MSI Gaming Radeon RX 570 256-bit 8GB GDDR5 DirectX 12 VR Ready CFX Graphics Card (RX 570 ARMOR 8G OC) Videos

Price History


Reviews (3)

3 reviews for MSI Gaming Radeon RX 570 256-bit 8GB GDDR5 DirectX 12 VR Ready CFX Graphics Card (RX 570 ARMOR 8G OC)

4.3 out of 5
  1. John Borges

    I had to get a “Metal compatible” GPU in order to run the new macOS Mojave beta. I did some research on the web and this was the most cost-effective card I could find. It was easily recognized by the OS, and is as fast or faster than a $350 7950 card. I didn’t flash the ROM, so the screen is blank during the initial bootup. I am keeping my old 5770 card in case I ever need it. Also, make sure you have the proper cable: I was able to get by with an older HDMI 1.4 cable with the 5770, but I had to use a 4K-compatible 2.0 cable to get this card to sync with my 3440×1440 monitor.

  2. Marques Alan

    I did not purchase this card for gaming, so I cannot review that function. This graphics card was super easy to install; I just popped it in and it was ready to go. No wires, no power supplies, and no software installs. I no longer see the boot-up screen on my Mac, but that’s not really an issue. It works great, and now I can install the latest macOS 10.14 on my 8-year-old Mac. Graphics are smooth and clear. It works great with the Adobe Creative Suite. It’s very quiet too; I cannot hear the fan running.

  3. CBerg

    Update 7 Feb 2019: AMD GPUs are definitely for people who like to tinker with settings. I finally figured out what was going on with Chill and games like Borderlands 2. It has to do with how Steam tags the game. Steam games are tagged with an ID number, so when AMD Settings auto-scans for games, it picks up the Steam ID number and not the path to the game’s .EXE file. Why does this matter? I don’t normally use Steam to start my games; I use a shortcut. So AMD Settings Chill does not think you started the game, and hence it doesn’t do its magic of limiting the frame rate, which in turn lowers wattage and temperature. If I start the game using Steam, Chill works like a charm. Sorry, there really is no manual for AMD Settings; I just looked around the internet and used trial and error. Hopefully my ranting in this type of forum helps you decide between buying an AMD or an Nvidia card.

    As an update on some metrics after I figured out Chill:
    Destiny 2: 68-75 FPS, 45-50 watts, 65-68 °C, 1600 rpm
    Elite Dangerous: Horizons: very consistent 75 FPS, 45-50 watts, 55-60 °C, 600-700 rpm

    Update 18 Jan 2019: The AMD Performance Overlay doesn’t really play well with games; sometimes the hotkey works and sometimes it doesn’t. However, I found that the AMD software has a great feature called AMD Link. (Privacy warning: this feature sends data about your graphics card and what you are doing to AMD servers.) Your graphics card sends performance data to the AMD servers, and using your smart device (phone or tablet) with the accompanying AMD Link app, you can watch different performance metrics while playing a game. The metrics are almost real time (every 1 second) and give me FPS, GPU utilization, GPU fan speed, GPU temperature, and GPU power. There are other metrics it can provide, like CPU utilization, but there is only so much screen real estate on an iPhone. If you want to keep historical records, there is a capture feature in the app. If you don’t mind the privacy issues, this is a good app.

    Update 24 Dec 2018: I have had the Vega for about 2 weeks and realized that there is a lot of tweaking involved to optimize gameplay. If you like tweaking settings, then AMD is the way to go; if you want more of a plug-and-play experience, you may want to go Nvidia. An example is Destiny 2 (the game I am playing the most right now). At default settings, using the performance overlay that comes with the AMD software, it was using 90% GPU utilization, hitting about 70 °C, with the fan at 2000+ rpm and drawing about 200 watts. After tweaking the driver settings (Chill at 57-80 FPS, Enhanced Sync on, FreeSync on) and turning off V-Sync in the game settings, the Vega was using about 40-50% GPU utilization at 56-60 °C, drawing about 57-100 watts, and getting about 65-72 FPS (FreeSync was making everything smooth; no tearing or stutters). So tweaking the settings allowed the GPU to work less but provide the same FPS. BTW, I tried tweaking Borderlands 2 and it does not run very well; I have a feeling Borderlands 2 was programmed around Nvidia. Heck, the opening screen is the Nvidia logo. So it will max out the AMD GPU just to get 50-60 FPS.

    I didn’t really need to upgrade from my current GTX 980 Ti. However, I had paired the GTX with an LG widescreen monitor, 1080p, with FreeSync. When I initially put together my computer, I was moving from a Mac, so I was not really educated on FreeSync or G-Sync. What I wanted was a widescreen monitor that would let me have several windows open when writing papers. As with most PC users, I started gravitating toward playing video games. So I had a choice: upgrade my monitor to a widescreen with G-Sync, or buy an AMD Radeon. G-Sync monitors are generally expensive, and widescreen ones even more so. Reviews of the Vega 64 said it was comparable to the GTX 1080; I would say it is more comparable to the GTX 980 Ti / GTX 1070. Anyway, long story short, the MSI Vega 64 dropped in price to about the GTX 1070’s range for a very short period, so I bought one. As a note, my first choice was the ASUS Vega 64, but at $750… yeah, no!

    Installation: Not hard. I pulled out the GTX 980 Ti and plugged in the Vega 64. I didn’t uninstall the Nvidia drivers or Nvidia software prior to installing the Vega 64. Not sure if this is common to all computers, but it took a long time for my computer to boot initially (about 5 minutes). This could be because the computer was trying to load the Nvidia drivers or had to install the generic Windows video drivers, but I was worried because of the long blank screen; I thought I had a DOA card. Next I downloaded the most current drivers, which were Radeon Adrenalin 2019, then uninstalled all the Nvidia software. After a reboot (much faster than the initial one), off I went.

    Performance-wise, it is about on par with my GTX 980 Ti. (Note: I had V-Sync turned off and did not cap the FPS. Discussions on the internet are not very conclusive about whether V-Sync should be on or off, or whether the frame rate should be capped at the monitor’s max refresh.)
    Destiny 2: a bit slower, 72-75 FPS at max settings; the GTX 980 Ti held a consistent 75 FPS.
    Elite Dangerous: the same, 75 FPS at max settings.
    Watch Dogs: about the same, 110 FPS at max settings.
    Borderlands 2: all over the place, 52-106 FPS at max settings; the GTX 980 Ti gave a consistent 75 FPS. However, I think FreeSync was working, because it was still very smooth as the FPS fluctuated. (Note: Borderlands 2 was optimized for Nvidia.)
    Borderlands: The Pre-Sequel: 100-108 FPS at max settings; the GTX 980 Ti gave a consistent 75 FPS. (Also optimized for Nvidia.)

    I have more games, but in general the Vega 64 is about on par with my GTX 980 Ti. Did I need to upgrade? No, hence the 3 stars. The move to the Vega 64 was not eye-shatteringly different. However, it satisfied my itch for FreeSync, the price was right at $400, and I got 3 new games (not yet released) with it ($160 value). Depending on the value you put on the games, the cost of the card drops to about $250. If I had to buy the games, the only one I would consider getting is Division 2; however, Devil May Cry 5 and Resident Evil 2 don’t seem like bad games.

    Other notes: The single blower fan is somewhat loud. It doesn’t bother me because my HyperX Cloud Alpha headphones do a really good job of blocking outside noise. Most gaming sessions will get the fan spinning at 2200 rpm and hit about 77 °C. I haven’t used Wattman yet to boost performance, so my results above use baseline settings.

