Intel Arc Pro B50 GPU Launched at $349 for Compact Workstations

72 points by qwytw | 64 comments | 9/7/2025, 10:06:35 PM | guru3d.com

Comments (64)

tester756 · 3h ago
https://www.phoronix.com/review/intel-arc-pro-b50-linux

>Overall the Intel Arc Pro B50 was at 1.47x the performance of the NVIDIA RTX A1000 with that mix of OpenGL, Vulkan, and OpenCL/Vulkan compute workloads both synthetic and real-world tests. That is just under Intel's own reported Windows figures of the Arc Pro B50 delivering 1.6x the performance of the RTX A1000 for graphics and 1.7x the performance of the A1000 for AI inference. This is all the more impressive when considering the Arc Pro B50 price of $349+ compared to the NVIDIA RTX A1000 at $420+.
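
Putting the quoted figures together, a quick perf-per-dollar back-of-envelope (using the "+" floor prices, so street prices will vary):

    # B50: 1.47x the RTX A1000's performance at $349 vs. the A1000's $420.
    b50_value = 1.47 / 349    # relative performance per dollar
    a1000_value = 1.00 / 420
    print(b50_value / a1000_value)  # ~1.77x the A1000's perf per dollar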

wewewedxfgdf · 3h ago
The new CEO of Intel has said that Intel is giving up competing with Nvidia.

Why would you bother with any Intel product with an attitude like that? It gives zero confidence in the company. What business is Intel in, if not competing with Nvidia and AMD? Is it giving up competing with AMD too?

SadTrombone · 3h ago
AMD has also often said that they can't compete with Nvidia at the high end, and as the other commenter said: market segments exist. Not everyone needs a 5090. If anything, people are starved for options in the budget/mid-range market, which is where Intel could pick up a solid chunk of market share.
pshirshov · 2h ago
Regardless of what they say, they CAN compete in training and inference; there is literally no alternative to the W7900 at the moment. That's 4080 performance with 48 GB of VRAM for half of what similar CUDA devices would cost.
grim_io · 55m ago
How good is it though compared to the 5090 with 32 GB? The 5090 has double the memory bandwidth, which is very important for inference.

In many cases where 32GB won't be enough, 48 wouldn't be enough either.

Oh and the 5090 is cheaper.

adgjlsfhk1 · 46m ago
AMD has more FP16 and FP64 flops (but ~1/2 the FP32 flops). The AMD card also runs at half the TDP (300 W vs 600 W).
grim_io · 29m ago
FP16+ doesn't really matter for local LLM inference; no one can run reasonably big models at FP16. Usually the models are quantized to 8/4 bits, where the 5090 again demolishes the W7900 by having a multiple of its max TOPS.
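
To put rough numbers on the VRAM-vs-bandwidth tradeoff, a back-of-envelope sketch (the bandwidth figures are the published specs, ~1792 GB/s for the 5090 and ~864 GB/s for the W7900; the 70B model size is an arbitrary example, and KV cache and overhead are ignored):

    # Weight footprint of a dense 70B model at several quantizations, plus
    # the bandwidth-bound ceiling on decode speed (each generated token
    # reads roughly the full set of weights once).
    def model_gib(params_b, bits_per_weight):
        return params_b * 1e9 * bits_per_weight / 8 / 2**30

    def tok_per_s_ceiling(weights_gib, bandwidth_gb_s):
        return bandwidth_gb_s / (weights_gib * 2**30 / 1e9)

    for bits in (16, 8, 4):
        gib = model_gib(70, bits)
        print(f"{bits}-bit: ~{gib:.0f} GiB weights | "
              f"5090: ~{tok_per_s_ceiling(gib, 1792):.0f} tok/s | "
              f"W7900: ~{tok_per_s_ceiling(gib, 864):.0f} tok/s")

At 4 bits a 70B model is ~33 GiB of weights, which fits the W7900's 48 GB but not the 5090's 32 GB; that's about the only regime where the extra VRAM wins outright.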
Mistletoe · 2h ago
I’m interested in buying a GPU that costs less than a used car.
grg0 · 3h ago
Zero confidence why? Market segments exist.

I want hardware that I can afford and own, not AI/datacenter crap that is useless to me.

mathnode · 3h ago
Because we don't need data centre hardware to run domestic software.
MangoToupe · 16m ago
I don't really want an nvidia gpu; it's too expensive and I won't use most of it. This actually looks attractive.
ryao · 2h ago
I thought he said that they gave up on competing with Nvidia at training, not in general. He left the door open to compete on inference. Did he say otherwise more recently?
ocdtrekkie · 3h ago
NVIDIA cards are unironically over $3,500 at the store in some cases...
mixmastamyk · 17m ago
Compact? Looks about a foot long and two slots wide. Not abnormal but not what I’d call compact either.
bitmasher9 · 4h ago
It’s interesting that it uses 4 DisplayPorts and not a single HDMI.

Is HDMI seen as a “gaming” feature, or is DP seen as a “workstation” interface? Ultimately HDMI is a brand that commands higher royalties than DP, so I suspect the choice was largely about minimizing costs. I wonder what percentage of the target audience has HDMI-only displays.

Aurornis · 3h ago
DisplayPort is the superior option for monitors. High end gaming monitors will have DisplayPort inputs.

Converting from DisplayPort to HDMI is trivial with a cheap adapter if necessary.

HDMI is mostly used on TVs and older monitors now.

unsnap_biceps · 2h ago
HDMI is still valuable for those of us who use KVMs. Cheap DisplayPort KVMs don't have EDID emulation, and expensive DisplayPort KVMs just don't work (in my experience).
Ardon · 1h ago
The only well-reviewed DisplayPort KVMs I'm aware of are from Level1Techs: https://www.store.level1techs.com/products/kvm

Not cheap though. And also not 100% caveat-free.

godman_8 · 1m ago
I have one and it still sucks. I ordered it after the one I bought on Amazon kind of sucked, thinking the L1T would be better, but it was worse than the Amazon one.
cjbconnor · 4h ago
4x Mini DP is common for low-profile workstation cards; see the Quadro P1000, T1000, Radeon Pro WX 4100, etc.
klodolph · 3h ago
This is the right answer. I see a bunch of people talking about licensing fees for HDMI, but when you’re plugging in 4 monitors it’s really nice to only use one type of cable. If you’re only using one type of cable, it’s gonna be DP.
accrual · 1h ago
Yeah, I recall even old Quadro cards in early Core era hardware often had quad mini DisplayPort.
trueismywork · 3h ago
The HDMI Forum also blocked AMD's open-source driver from implementing HDMI 2.1.

https://www.theregister.com/2024/03/02/hdmi_blocks_amd_foss/

dale_glass · 4h ago
HDMI requires paying license fees. DP is an open standard.
mananaysiempre · 3h ago
As far as things I care about go, the HDMI Forum’s overt hostility[1] to open-source drivers is the important part, but it would indeed be interesting to know what Intel cared about there.

(Note that some self-described “open” standards are not royalty-free, only RAND-licensed by somebody's definition of “R” and “ND”. And some don't have their text available free of charge, either, let alone have a development process open to all comers. I believe the only thing the phrase “open standard” reliably implies at this point is that access to the text does not require signing an NDA.

DisplayPort in particular is royalty-free—although of course with patents you can never really know—while legal access to the text is gated[2] behind a VESA membership with dues based on company revenue—I can't find the official formula, but Wikipedia claims $5k/yr minimum.)

[1] https://hackaday.com/2023/07/11/displayport-a-better-video-i...

[2] https://vesa.org/vesa-standards/

stephen_g · 58m ago
I'm not 100% sure, but last time I looked it wasn't openly available anymore. It may still be royalty-free, but when I tried to download the specification the site said you now have to be a member of VESA to download the standard (it is still possible to find earlier versions openly).
amiga-workbench · 52m ago
Because you can actually fit 4 of them without impeding airflow from the heatsink. Mini HDMI is mechanically ass and I've never seen it anywhere but junky Android tablets. DP also isn't proprietary.
KetoManx64 · 3h ago
There are inexpensive ($10ish) converters that do DP > HDMI, but the inverse is much more expensive ($50-100).
jsheard · 3h ago
That's because DP sources can (and nearly always do) support encoding HDMI as a secondary mode, so all you need is a passive adapter. Going the other way requires active conversion.

I assume you have to pay HDMI royalties for DP ports which support the full HDMI spec, but older HDMI versions were supersets of DVI, so you can encode a basic HDMI compatible signal without stepping on their IP.

stephen_g · 55m ago
That only works if the port supports it passively (called “DP++ dual mode”); if you have a DP-only port, you need an active converter, which falls into the latter price range you mentioned.
mrinterweb · 3h ago
DisplayPort is the interface most gaming monitors use. If you don't need HDMI ARC or CEC, which are mostly used in home theater builds, DisplayPort is preferable.
hamdingers · 3h ago
Can't fit 4 of anything else in a half height bracket.
glitchc · 4h ago
The latest DP standard has higher bandwidth and can support higher framerates at the same resolution.
StrangeDoctor · 2h ago
There’s also weirdness with the drivers and HDMI, I think mainly around encryption (HDCP). But if you only have DP and include an adapter, it’s suddenly “not my problem” from the perspective of Intel.

shmerl · 3h ago
DP is perfectly fine for gaming (it's better than HDMI). The only reason HDMI is lingering around is the cartel which profits from patents on it, and manufacturers of TVs which stuff them with HDMI and don't provide DP or USB-C ports.

Otherwise HDMI would have been dead a long time ago.

syntaxing · 3h ago
Kinda bummed that it’s $50 more than originally announced. But if it works well, a single-slot card that can be powered by the PCIe slot alone is super valuable. Hoping there will be some affordable prebuilds so I can run some MoE LLMs.
jeffbee · 30m ago
Competing workstation cards like the RTX A2000 also do not need power connectors.
Havoc · 4h ago
Good pricing for 16 GB of VRAM. I can see that finding a use in some home servers.
jazzyjackson · 4h ago
Huh, I didn't realize these were just released. I came across this card while looking for a GPU with AV1 hardware encoding, as I've been putting a shopping cart together for a mini-ITX Xeon server for all my ffmpeg shenanigans.

I like to buy American when I can, but it's hard to find out which fabs various CPUs and GPUs are made in. I read Kingston does some RAM here and Crucial some SSDs. Maybe the silicon is fabbed here, but everything I found is "assembled in Taiwan," which made me feel like I should get my dream machine sooner rather than later.

bane · 3h ago
You may want to check whether your Xeon already supports hardware encoding of AV1 in the iGPU. I saved a bundle building a media server when I realized the iGPU was more than sufficient (and more efficient) compared with chucking a discrete GPU in the case.

I have a service that runs continuously and re-encodes any videos I have into H.265, and the iGPU barely even notices it.
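
For reference, that kind of service is essentially just an ffmpeg invocation with Quick Sync (QSV) doing the work on the iGPU. A minimal sketch, wrapped in Python (paths and the quality value are placeholders; check `ffmpeg -encoders` for hevc_qsv, or av1_qsv on parts with AV1 encode):

    import subprocess

    def reencode_hevc_qsv(src, dst, quality=25):
        # Decode and encode on the iGPU; pass audio through untouched.
        subprocess.run([
            "ffmpeg", "-hwaccel", "qsv",
            "-i", src,
            "-c:v", "hevc_qsv",               # Quick Sync H.265 encoder
            "-global_quality", str(quality),  # ICQ quality, lower = better
            "-c:a", "copy",
            dst,
        ], check=True)

    reencode_hevc_qsv("input.mkv", "output_h265.mkv")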

jazzyjackson · 3m ago
Looks like Core Ultra is the only chip with an integrated Arc GPU that has AV1 encode. The Xeon series I was looking at, the LGA 1700 socket ones, so the E-2400s, definitely don't have an iGPU. (The fact that the motherboard I'm looking at only has VGA is probably a clue xD)

I'll have to consider pros and cons with Ultra chips, thanks for the tip.

jeffbee · 25m ago
What recent Xeon has the iGPU? Didn't they stop including them ~5 years ago?
dangus · 12m ago
I have the answer for you: Intel's GPU chips are on TSMC's process. They are not made in Intel-owned fabs.

There really is no such thing as "buying American" in the computer hardware industry unless you are talking about the designs rather than the assembly. There are also critical parts of the lithography process that depend on US technology, which is why the US is able to enforce certain sanctions (along with alliances with the other countries that own the remaining parts of the process).

Personally I think people get way too worked up about being protectionist when it comes to global trade. We all want to buy our own country's products over others but we definitely wouldn't like it if other countries stopped buying our exported products.

When Apple sells an iPhone in China (and they sure buy a lot of them), Apple is making most of the money in that transaction by a large margin, and in turn so are you, since your 401k is probably full of Apple stock, as are those of the 60+% of Americans who invest in the stock market. A typical iPhone user will give Apple more profit from services than from the sale of the actual device. The value is really not in the hardware assembly.

In the case of electronics products like this, almost the entire value add is in the design of the chip and the software that is running on it, which represents all the high-wage work, and a whole lot of that labor in the US.

US citizens really shouldn't envy a job where people sit at an electronics bench doing repetitive assembly work for 12 hours a day in a factory, wishing we had more of those jobs in our country. They should instead focus on making higher education more available/affordable so that the US stays on top of the economic food chain, with most or all of its citizens doing high-value work, rather than letting education get expensive and begging foreign manufacturers to open satellite factories to employ our uneducated masses.

I think the current wave of populist protectionist ideology essentially blames the wrong causes for declining affordability and increasing inequality for the working class. People think that bringing manufacturing jobs back and reversing globalism will right the ship on income inequality, but the reality is that equality was so good for Americans in the mid-century because the wealthy were taxed heavily, European manufacturing was decimated in WW2, and labor was in high demand.

The above of course is all my opinion on the situation, and a rather long tangent.

Havoc · 4h ago
If you don't need it for AI shenanigans then you're better off with the smaller Arcs for under $100... they can do AV1 too.
jauntywundrkind · 2h ago
I don't know how big the impact really is, but Intel is pretty far behind on encoder quality for most codecs. On AV1, though, they seem pretty competitive? Neat.

Apologies for the video link. But a recent pretty in depth comparison: https://youtu.be/kkf7q4L5xl8

dev1ycan · 10m ago
I really hope Intel continues with GPUs, or the GPU market is doomed until China catches up. Nvidia produces good products with great software, best in industry really, with long support lifetimes, but that doesn't excuse their monopolistic practices. The fact that AMD refuses to compete really makes it look like this entire thing is organized from the top (US government).

This reminds me a lot of the LLM craze and how they wanted to charge so much for simple usage at the start, until China released DeepSeek. Ideally we shouldn't rely on China, but do we have a choice? The entire US economy has become reliant on monopolies to keep their insanely high stock prices and profit margins.

dale_glass · 4h ago
What about the B60, with the 24GB VRAM?

Also, do these support SR-IOV, as in handing slices of the GPU to virtual machines?

wqaatwt · 4h ago
SR-IOV is allegedly coming in the future (just like the B60).
kev009 · 2h ago
Is there a way to get acceptable performance out of these without Resizable BAR? I'd like to retromod older business desktops.
shrubble · 3h ago
With the card drawing its 70 W from the PCIe slot connector only, how feasible is it to have 3 per server, for effectively 48 GB of VRAM for tasks?
adgjlsfhk1 · 2h ago
You're probably better off with the upcoming B60, which has 24 GB of VRAM.
syntaxing · 2h ago
There’s a 48GB B60 already [1], but it definitely costs a significant amount more than 3x the B50 Pro (you obviously get way better performance).

[1] https://www.maxsun.com/products/intel-arc-pro-b60-dual-48g-t...
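
On the software side, spreading a model across three cards is already routine in llama.cpp; a rough sketch of serving that way (the flags are standard llama.cpp options, but the model file is a placeholder and you'd want a build with the SYCL or Vulkan backend for Arc):

    import subprocess

    subprocess.run([
        "llama-server",
        "-m", "model-q4_k_m.gguf",   # placeholder model path
        "-ngl", "99",                # offload all layers to the GPUs
        "--tensor-split", "1,1,1",   # split weights evenly across 3 cards
    ], check=True)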

addisonj · 2h ago
I think the answer to that right now is highly workload dependent. From what I have seen, it is improving rapidly, but still very early days for the software stack compared to Nvidia
pshirshov · 2h ago
> 16 GB of GDDR6 VRAM

I would happily buy 96 GB for $3,490, but this makes very little sense.

bn-l · 2h ago
> 224 GB/s of effective bandwidth
jacquesm · 2h ago
How is the software support side for AI work with this card?
ytch · 3h ago
Another advantage of Intel GPUs is vGPU SR-IOV, which the consumer video cards from NVIDIA and AMD don't support. Even the integrated GPU of the N100 and N97 supports it[1].

Therefore I can install Proxmox VE and run multiple VMs, assigning a vGPU to each of them for video transcoding (IPCam NVR), AI, and other applications.

https://github.com/Upinel/PVE-Intel-vGPU
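
Once an SR-IOV-capable i915 driver like the one that repo sets up is loaded, carving out virtual functions is just a sysfs write on the host. A minimal sketch (run as root; 0000:00:02.0 is the usual iGPU PCI address but check yours with lspci, and the VF count of 4 is arbitrary):

    from pathlib import Path

    igpu = Path("/sys/bus/pci/devices/0000:00:02.0")
    total = int((igpu / "sriov_totalvfs").read_text())  # max VFs the driver offers
    (igpu / "sriov_numvfs").write_text(str(min(4, total)))

Each VF then shows up as its own PCI device that Proxmox can pass through to a VM.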

samspenc · 3h ago
Not bad with 16 GB of VRAM, though a bit disappointing on performance, looking at the Blender 3D (open source) benchmarks: https://opendata.blender.org/benchmarks/query/?compute_type=...

It clocks in at 1503.4 samples per second, behind the NVIDIA RTX 2060 (1590.93 samples/sec, released Jan 2019), the AMD Radeon RX 6750 XT (1539, May 2022), and the Apple M3 Pro 14-core GPU (1651.85, Oct 2023).

Note that this comparison covers only ray-traced rendering, which is most relevant to games, but it gives some sense of how the card stacks up against its competition.

adgjlsfhk1 · 2h ago
It wouldn't surprise me if there's a 10-20% perf improvement to be had from drivers/software for this. Intel's architecture is pretty new and nothing is optimized for it yet.
bee_rider · 2h ago
Is that a comparison of the raytracing fixed function hardware for the various GPUs, or is it a GPGPU comparison?
jaggs · 3h ago
How does it compare to an RTX 5060 Ti with 16 GB of VRAM?
mananaysiempre · 4h ago
A $350 “workstation” GPU with 16 GB of VRAM? I... guess, but is that really enough for the kinds of things that would have you looking for workstation-level GPUs in the first place?
adgjlsfhk1 · 3h ago
The closest comparable on the Nvidia side is the RTX Pro 2000, which is 16 GB of VRAM for $650 (and likely more compute).
jacquesm · 2h ago
And likely much better support for that compute.