Intel keeps very careful track of its internal engineering samples, going to great lengths to ensure that if they leave the lab, it's in pieces so small that they could never be reassembled. So the first question we need to answer is: how did we get our hands on this thing? eBay, obviously. What can't you buy on eBay? A sponsorship on Linus Tech Tips. For that you have to talk to Colton, like Corsair did. Corsair's Dark Core SE RGB wireless mouse features 1ms 2.4GHz and low-latency Bluetooth connectivity.
Check it out at the link below. [intro music]

So the seller turned out to have been a contractor at Intel a few years ago, who in the middle of a sixth-floor renovation went dumpster diving through the boxes full of "junk" that were destined for the e-waste pile. Apparently it was a treasure trove of press samples, laptops and this GPU-looking thing, which was noteworthy for being blue instead of green or red. You see, by that point Project Larrabee - this, sort of - had been cancelled for years, and how many years depends on which cancellation you're going by. Talking to Tom Forsyth, who was one of the key team members, he figures they got cancelled anywhere from four to five times, and he remembers getting these weird memos. Yeah... You guys are gonna see some headlines. It's just a thing. None of you were laid off. Just keep on working. BT-dubs, you've been rebranded Xeon Phi. Thanks, bye.

So what is this thing? That's actually a somewhat complicated question, but because it's got a DVI port, not to mention DisplayPort and HDMI, soldered onto it, it is technically an engineering sample board for Intel's first and, to date, only dedicated graphics card. Now, most people who follow the mainstream tech press believe that Project Larrabee was an abject failure, but as is so often the case, the truth is actually stranger than fiction. Not only was it a success, but it powered Tianhe-2, which was the world's most powerful supercomputer for over two years, and ten years later you can actually still buy its descendants, either in socketed form, as we reviewed just last year, or on Amazon for a cool 1,500 greenbacks.

As it turns out, the goal of the program was never actually to create a gaming GPU. That was just a workload that was already fairly well understood at the time, because, you've got to remember, back in the mid 2000s the idea of using a GPU as a general-purpose computing unit was just emerging. So this idea of using it for gaming was really just a small part of a business case to build a processor with many highly efficient x86 cores that could easily be, like, slotted into these powerful supercomputers. But that doesn't mean it couldn't have been used for gaming. In fact, by the time they wound down the units that were working on graphics, they had about 300 of the top-selling games on Steam running on the thing, with a card just like this one as the only GPU in the system.

And the way this whole thing worked is incredible. Now, a normal graphics card, or GPU rather, uses a lot of fixed-function hardware, so if you told it "okay look, I don't need shaders, just draw a ton of tiny lines with really nice anti-aliasing" - so, pretty much CAD in a nutshell - it would use only a fraction of its hardware. But with Larrabee everything is software, so the whole chip is lit up doing that, and that actually helped to offset the x86 overhead a fair bit. This was the fastest CAD card at the time, and it had other benefits. With regular GPUs, you might run into a situation where enabling a particular feature in a game hits AMD users a lot harder than Nvidia users, or vice versa. So during development, AMD and Nvidia both have to guess as best they can what the next couple of years of games will demand, then try to look into their crystal ball and build their hardware around that. Larrabee? No such limitation. This thing is a full-blown computer with up to 61 quad-threaded cores running a normal operating system like FreeBSD.
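To make that "the whole pipeline is just software running on a pile of x86 cores" idea a bit more concrete, here is a minimal, hypothetical C++ sketch. It has nothing to do with Intel's actual renderer - the resolution, the shade() function and the scanline-band split are all made up for illustration - but it shows the basic shape of the thing: the "pixel shader" is an ordinary function, and the screen simply gets carved up across however many hardware threads you have.

```cpp
// Toy sketch only: a "GPU stage" as plain code spread across x86 cores.
// Nothing here is Larrabee's real renderer; the resolution, shade() and
// the scanline-band split are invented purely for illustration.
#include <algorithm>
#include <cstdint>
#include <thread>
#include <vector>

constexpr int kWidth  = 1920;
constexpr int kHeight = 1080;

// When the whole pipeline is software, a "pixel shader" is just a function.
static uint32_t shade(int x, int y) {
    uint8_t r = static_cast<uint8_t>(x * 255 / kWidth);
    uint8_t g = static_cast<uint8_t>(y * 255 / kHeight);
    return (0xFFu << 24) | (uint32_t(r) << 16) | (uint32_t(g) << 8) | 0x40u;
}

int main() {
    std::vector<uint32_t> framebuffer(size_t(kWidth) * kHeight);
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    // Carve the screen into horizontal bands, one per hardware thread,
    // so every core is lit up running the same bit of shading code.
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < cores; ++t) {
        workers.emplace_back([&, t] {
            int y0 = kHeight * t / cores;
            int y1 = kHeight * (t + 1) / cores;
            for (int y = y0; y < y1; ++y)
                for (int x = 0; x < kWidth; ++x)
                    framebuffer[size_t(y) * kWidth + x] = shade(x, y);
        });
    }
    for (auto& w : workers) w.join();
    return 0;
}
```

The point of the sketch is only this: on a Larrabee-style part there is no fixed-function block sitting idle while something like this runs - it's all just code, so all the cores get used.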
And because it's a full-blown computer, you could actually telnet into the thing and run a top command and see a list of all the processes that were running on it, and if you were running a game you'd see, I don't know, 128 or 200 processes called "DirectX graphics" - and you could do that while the thing was working! So if you wanted, you could cordon off some of the cores and use them for something else, or you could just yolo it, throw another workload into the mix and let the processor manage itself. The only non-programmable hardware on this puppy is the texture unit, which takes very simple commands.

I mean, wrap your brain around this: the thing that I'm looking at right here is Intel's first-ever DirectX 11 GPU, even though it was built before DirectX 11. That was possible because all of those graphics card features that normally run in hardware are just running in software, so you could actually update it to DirectX 11 or DirectX 12 with a driver update. Now, there are some caveats here. I mean, there's a reason the thing never made it into a computer near you. It wasn't as efficient as a dedicated graphics card for a lot of things, so it only got about a quarter of the gaming performance of a comparably power-consuming card from AMD or Nvidia at the time. But it was really good at certain graphics workloads for a number of reasons, and - I mean, if you think about how far off they were, considering that they were effectively emulating dedicated hardware, it's damn impressive.

So... what happened? Well, management happened. Intel at its core, ha-ha, is a hardware company, so they wanted all the features completed so they could either ship this thing or can it, because in the hardware world making up a four-times difference in performance is impossible and you might as well just pull the plug. But the team wanted to work on performance optimization instead, because in the software world it's not unheard of to go from, like, two pixels showing up on a screen and dog slow to a hundred times faster in a week, if you have a breakthrough. And it got to the point where they had to have separate teams for performance and for features, just to get management off their backs. The performance team actually got Quake running really fast, but then they found out that Quake was this weird edge case and the architecture would have to be completely redone. I mean, to give you some idea of the dysfunction, at one point there were three or four software teams with different ideas, working on different rendering architectures.

But depending on who you ask, continued development would have been worth it. I mean, imagine this: instead of turning anti-aliasing on for an entire scene, imagine if a game developer could say "Well, you know what? This sky is not important to be anti-aliased. Why don't we just focus all of our AA on, you know, these characters here, or this foliage there?" (There's a rough sketch of that per-region anti-aliasing idea just below.) Or how about this: "Oh crap, that texture wasn't loaded. You know what, let's just procedurally generate a placeholder." Boom.

Arguably the stupidest decision that was made was to make the Larrabee graphics team and the Gen graphics team - which is what Intel calls its integrated graphics internally - compete for the same budget, and then, like, make internal presentations arguing about why their approach was the future and the other group's was not, because they were both perfectly suitable for what they were doing.
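To picture that per-region anti-aliasing idea - skip the sky, spend the samples on the characters - here is a rough, hypothetical C++ sketch. The tile size, the samplesForTile() heuristic and the shade() stand-in are all invented for illustration, and this is not Intel's code; the point is just that in a software pipeline, a per-tile decision like this is nothing more than an ordinary branch.

```cpp
// Toy sketch only: per-region anti-aliasing in a software renderer.
// The tile size, importance heuristic and shade() stand-in are made up;
// this is not how Larrabee actually worked, just the general idea.
#include <algorithm>
#include <vector>

constexpr int kWidth = 640, kHeight = 360, kTile = 16;

// Stand-in "pixel shader".
static float shade(float x, float y) { return 0.5f * (x / kWidth + y / kHeight); }

// Hypothetical importance map: a game could flag characters or foliage as
// important and the sky as not. Here it's simply "the bottom half matters".
static int samplesForTile(int tileY) {
    return (tileY * kTile < kHeight / 2) ? 1 : 8;  // 1 sample for "sky", 8 for the rest
}

int main() {
    std::vector<float> framebuffer(kWidth * kHeight);
    for (int ty = 0; ty * kTile < kHeight; ++ty) {
        for (int tx = 0; tx * kTile < kWidth; ++tx) {
            // In a software pipeline this per-tile choice is just an if(),
            // not something baked into silicon.
            int spp = samplesForTile(ty);
            for (int y = ty * kTile; y < std::min((ty + 1) * kTile, kHeight); ++y) {
                for (int x = tx * kTile; x < std::min((tx + 1) * kTile, kWidth); ++x) {
                    float sum = 0.0f;
                    for (int s = 0; s < spp; ++s) {
                        // Crude sub-pixel offsets; a real renderer would jitter properly.
                        float jx = (s % 2) * 0.5f;
                        float jy = ((s / 2) % 2) * 0.5f;
                        sum += shade(x + jx, y + jy);
                    }
                    framebuffer[y * kWidth + x] = sum / spp;
                }
            }
        }
    }
    return 0;
}
```

Swap samplesForTile() for whatever the game actually cares about and you get the "don't waste anti-aliasing on the sky" behaviour described above - something a fixed-function block can't be talked into nearly as easily.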
Anyway, back to the Larrabee-versus-Gen fight: the two really were suited to different things. Larrabee was never going to be a 5-watt part that you could fit right into a CPU, and a 200-watt PCI Express part was nowhere on the roadmap for Gen.

So, what I've got here (come on, come on, come on) is not Knights Ferry - that was the first Larrabee revision, the one with some deal-breaking bugs. Apparently the saying in the hardware industry is "always plan to make a prototype, since you'll end up making one anyway". This is Knights Corner, and it probably has anywhere from 6 to 16 gigs of RAM and up to 62 cores, depending on how many of them had manufacturing flaws.

Should we fire it up? I mean, come on, I wasn't not gonna do that at this point. I spent like $400 on this thing off of eBay. I've got no drivers for it, so this is actually the first time I've turned it on, and it is very possible that it won't manage to display anything even in 2D, but I definitely... have to try. By the way, if anyone out there has the secret-sauce drivers, or has access to the secret-sauce drivers that would make this run games, please hit me up. I mean, assuming that it even works, which we don't know yet. I- I actually haven't tried this; I wanted to save the suspense for the video.

This is, like, far more POST codes than I'm accustomed to seeing. But it hasn't stopped, and it hasn't, like, rebooted. We've got some kind of uh... we've got some kind of LED on here. It looks like it stalled on D6, but I don't know what that is. Now, when I talked to Tom, he did specifically mention it's got DVI soldered to it. I don't know if that's because DVI was the most relevant output at the time, so that's what they used internally, or if the DisplayPort and HDMI were just dummies and DVI was the only thing that actually worked, so... Take two: I'm gonna run and grab a DVI monitor and try this again.

I kind of wonder about... You know what, it's PCIe... I mean, would that even be gen two at that point? Like, two thousand... 2007? 2009? I wonder about compatibility with a new board and stuff like that. You know what, I don't think it's gonna boot. Well, that's pretty disappointing. I thought I might be onto something with the whole DVI thing. I'm just gonna try- I'm gonna try one other slot, just uh... I think there's only one other one of these out in the wild, and some, like, Russian collector of weird hardware has it. Yeah. Not you, a different one. Okay, sometimes it hangs on 79 for a bit and then this thing boots, so that might have been a good sign. Oh no, that's D6 again. I think it's not going anywhere. Well, that was disappointing, but I'm... I'm gonna let it keep trying while I tell you guys about Massdrop. Oh, I'm, like, sad. It's, like, hard to have any energy. Okay, we'll try that again.

Massdrop! Massdrop is featuring the Sennheiser PC37X gaming headset. They've got angled drivers and an open-back design, and the drivers actually come from the same family as the HD598 and HD600 headphones. They offer superior stereo imaging and locational accuracy, and come with a noise-cancelling microphone. They're available on Massdrop at the link below for a limited time for just $120, so go check them out.

So thanks for watching, guys. If this video sucked, you know what to do, but if it was awesome, get subscribed, hit that like button - you can especially hit that like button if you want to make me feel better about how disappointed I feel right now. Uh... or you can check out the link to where to buy the stuff we featured in the video description.
Also linked in the description is our merch store, which has cool shirts like this one, and our community forum, which you should totally join. Oh, I really... I was really hoping I was just gonna get the screen to light up. That was all I was really... It was all I really wanted. Good night, sweet prince. You were too good for this world.
Aaand it doesn't boot, lol
For anyone interested, here is AnandTech's deep dive into the Larrabee architecture, from almost 10 (!) years ago.
Isn't this really old tech that never got released? I vaguely remember a sister product alongside this that was a coprocessor.
Ooh, Larrabee. I've had a soft spot for this thing ever since John Carmack got us all excited for it; I really hoped it would arrive and change the market. Alas...
Carmack talking about Larrabee (with Rage) https://www.youtube.com/watch?v=cLLL99S7Uns
https://www.youtube.com/watch?v=hapCuhAs1nA
Footage from the Quake Wars port he did with Intel https://www.youtube.com/watch?v=mtHDSG2wNho
Linus namedrops one of the engineers on it, Tom Forsyth. He's a pretty cool dude; if you don't know him, you should check out his various works over the years. Among others, he's been at Intel and Valve, and until recently was one of the Oculus runtime architects.
https://twitter.com/tom_forsyth
http://eelpi.gotdns.org/
I'm still just pissed Intel killed Project Offset along with Larrabee; it looked mind-blowing for the time. The first demos were shown in 2005, two years before Crysis - set your mind to that time period and watch this.
https://www.youtube.com/watch?v=TWNokSt_DjA
If it's a 22nm Knights Corner as he says, it's not even remotely close to Larrabee. Before KNC came the 45nm Knights Ferry, which bears a much closer resemblance but is still not even close.
The concept sounds amazing. Being able to run general processing on the thing is a huge game-changer for GPUs. Too bad it got canned.
Darn Linus and his clickbait titles! FYI: this is not a recent/future prototype. This is an engineering sample from Intel's first attempt at a "GPU": Project Larrabee.
https://en.wikipedia.org/wiki/Larrabee_(microarchitecture)
What I'd suggest is cleaning the contacts on the PCIe connector. I had bought a few old GPUs from a computer recycler before that I thought were broken, but after I cleaned the contacts they worked just fine. Well, some of them did; some were actually non-functional.