After AMD's keynote at CES 2026, we were invited to a roundtable discussion with AMD at the show in Las Vegas, Nevada, covering the company's latest graphics technologies, including FSR Redstone. Some highlights include insight into why AMD chose not to put RDNA 4 graphics in its Ryzen AI 400 mobile chips, the open-sourcing of FSR, and much more.
Before getting stuck into the press Q&A, be sure to check out our previous coverage of AMD's announcements at the show, including Redstone on RDNA 3, the hint at open-sourcing FSR 4, and the company's overall announcements from its keynote last week.
AMD at CES® 2026 Replay - YouTube
With all of that out of the way, please note that some elements of the transcript have been lightly edited for flow and clarity, and we have also identified certain speakers when they were made known. The recording was made in a noisy ambient environment, so some things were lost in the audio mix; we've denoted those as such in the copy.
Josh Hort (AMD): So here, we're at the FSR Redstone Roundtable; my name is Josh Hort. I'm Senior Director here at AMD in the Computing and Graphics group, so I work underneath Jack Huynh. You've probably heard that name before. I lead ISV enablement. So what does that mean? It means everything from AI PC software enabling, to building that ecosystem up, to all the game engineering and integration of FSR technologies, to benchmark optimizations and benchmark engineering.
So my team is responsible for working with the ULs and Primate Labs of the world to make sure we get the scores that we do on the products that we deliver, and we're also responsible for developer programs for GPUOpen; well, that's the website that my team manages, as well as hardware seeding to developers and other key parties and partners. So that's me in a nutshell. I've been at AMD about seven and a half years; you can find this on LinkedIn. Before that, I worked at Intel for 17.
I don't know how much you guys are aware of Redstone. I can go really fast through this, or I can go really slow. If you guys all know this stuff, we can just blow through it, because it's really just a recap of what happened on December 10.
Journalist 1: Yeah, I know it. I think we all do.
Other Journalists: [chorus of acknowledgment]
Josh Hort (AMD): If you know what was released: we have over 200 titles that we delivered in 2025, which is — I do have to pause though, because the amount of progress that we made in 2025 is phenomenal. I mean, from the outside looking in, I hope you guys see it the same way, because when we launched at the end of February, we had, what, thirty? Thirty-two titles, 33 or something. I'll tell you: internally, I told my team, by Computex, let's have a stretch goal of 100 titles, and they blew that number way out of the park. So it's just fantastic, the amount of reception we've had.
And I think — combined with putting it on GPUOpen, which has really also increased the uptake. Because we can't be everything to everyone. [...] We can't cover the whole sphere of developers that are out there, right? So putting the code out on GPUOpen has also driven some uptake that we weren't even anticipating. So it's working, I guess, long story short, right? If you look at FSR 1 / 2 / 3 support, it's 500-plus [games]; Redstone, it's 200-plus. And you guys are aware that Redstone is for the 9000 series cards. Okay, so if there aren't any questions, I can just stop [...], and you guys can ask away. We don't need any slides. Sure.
Journalist 2: Multi-frame generation.
Josh Hort (AMD): Multi-frame? I can't comment on future plans. I can say absolutely we're looking at it. I think in general, we need to get the right things right first, and multi-frame gen, of course, introduces latency. And so how do you combat that? We have technologies like Anti-Lag, but we have to marry those two technologies together so that you can mitigate the latency as best as possible.
Journalist 3: So I think a more rounded question is, do you see a demand for multi-frame gen? I know when I go to play a game, I rarely find myself going above 2x frame gen.
Josh Hort (AMD): It's always in the eye of the beholder, right?
Journalist 3: Sure.
Josh Hort (AMD): Some people are more tolerant to latency. For some games, like twitch games, FPS games, it's not really appropriate. But if you're playing something casual, the casual side usually also has fewer demands on the GPU, so you don't need it; your frame rate is already high enough, right? And if you're a casual gamer, you're not trying to get to 240 FPS. So is there a place for it? [noise of uncertainty] Like I said, we're looking at it, but we don't have any product announcements to make at this time.
Josh Hort (AMD): How are you? I'm Josh. Dean? Pleasure.
"Dean": Will I get better at 240 FPS?
Josh Hort (AMD): Will you get better at 240? I dunno. If you've got the golden eye, maybe. eSports games, sure, right?
AMD Representative: For that, you've just gotta get really close to the monitor, and turn it to 240.
Josh Hort (AMD): I know eSports guys that can see to the pixel at 240 Hz. Some of these people — even if just one pixel changes, they can see that. Okay. I dunno about you guys, but my vision's getting a lot worse. [chuckles]
Journalist 4: So what about support for RDNA 3.5? Because there was an announcement yesterday, the new mobile Ryzen AI 400 series [...] a new stack, but they don't get new [...] performance.
Josh Hort (AMD): I feel like the question's been answered already, so I don't have much — anything more to add. I'd say that we're always evaluating the roadmap, and we have to make the right priority call. [...] I will say in general that we have a lot of products out there, and it's a lot of support. It's not just hey, "I took some leaked source code, and I put it on the internet, and it works." That's not how you make a product.
Journalist 3: I don't want to monopolize this conversation, so if anyone else—
Josh Hort (AMD): No. I mean, you guys, feel free to riff off each other, right? Let's make it a natural conversation.
Journalist 3: I had a question on handhelds. Right now, AMD's hardware is dominant in the handheld space, but there's a—
Josh Hort (AMD): Another player that's out there that's pretty small but substantial.
Journalist 3: Yes.
Josh Hort (AMD): And I love 'em by the way.
Journalist 3: Right now, the feature support is there [...] on the Windows handhelds, but there's kind of this disparate integration; there are features like RSR and FMF that could be brought to the forefront of that kind of handheld experience. I'm wondering how you're engaging with OEMs to ensure that those features are brought to the forefront so it's easy for players.
Josh Hort (AMD): It's a different team from mine that works with the OEMs, so I can't really speak from any first-hand knowledge, but nothing would prevent them from bringing the Adrenalin driver to the device, and then getting the analytical-based FMF or upscaling as well.
Journalist 3: So would you be engaged with Valve on the SteamOS side of things, or Lenovo?
Josh Hort (AMD): Yeah, I can't comment on the specifics, right? But absolutely; our partnership with Valve is deeply important to us, and those kinds of features are important to them as well. But I don't have any announcements, [or] feature sets to talk about.
Journalist 3: Okay. So, without going into announcements, is the interest there from — SteamOS is becoming an increasingly popular destination for handhelds; is there interest in those types of features? Is it something that you're discussing?
Josh Hort (AMD): We're absolutely investigating it with them, yes. But again, I can't speak for Valve, or what their intentions are.
Journalist 3: And this is, I guess, a broader Linux question, because the Adrenalin support in Linux is not really there.
Josh Hort (AMD): Well, RADV has really taken off, and a lot of it is _because_ of Valve. Have you seen the amount of contributions they've made to the driver? It's phenomenal. And that's not to say that the [AMDGPU-PRO] driver has gone away, but there's just fewer releases, and they're very targeted releases. But RADV is meant to be the main open-source driver going forwards.
Journalist 3: I'm aware that OEMs can tap into any feature within Adrenalin; I'm more curious if there's any engagement from AMD on that side, because — of course, I understand that, but when I'm going to pick up a handheld, and there's the Lenovo one, there's the MSI one, and there's the ASUS one, and I want the Ryzen Z2 Extreme, and the features that I'm expecting aren't easily accessible, it doesn't create the best user experience.
Josh Hort (AMD): Well, FSR and frame gen are on the ROG Ally X, right? And the ROG Ally.
Journalist 3: ASUS has done a good job with it. More so than Lenovo and MSI.
Josh Hort (AMD): I see; so that's more what your question is targeted at. I mean, I can't speak on behalf of them, and what they chose for their products is their decision, but the technology exists and it works, right? The Xbox devices do support the analytical frame-gen and upscaling. And actually you can get really good framerates on some games that are very challenging.
Journalist 5: I hacked Lossless Scaling onto my Steam Deck for that exact reason. Hey man, if I had AFMF on my Steam Deck and I didn't need to do that, that would be great.
[some discussion of Samsung's new 1,040 Hz Odyssey 3D 6K monitor]
Josh Hort (AMD): Well, you gotta remember that means you have to make 1,040 frames per second on the GPU side, which is going to eat into your performance. Your frametime is sub-millisecond. You're not rendering a lot of geometry at that point.
AMD Representative: If you had the choice, then, would you rather be at 240 or that brand-new 1040 Hz monitor?
Josh Hort (AMD): Personally — this is just my personal — this is not an AMD-based opinion — [...] Everybody's trying to push technology to the limits. I'd much rather take 4x the frametime to get 4x the amount of pixels, and quality pixels, into it, than run at some ridiculous framerate. And the ML techniques can improve that, right, like MFG, multi-frame gen? Sure, absolutely. But if you're doing 6x, 8x, 32x frame gen? The latency falls apart. Your mind is very quickly going to realize that—
Journalist 3: Even 4x is pretty rough.
Josh Hort (AMD): Yes.
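[Note: the frame-time budgets being traded off in this exchange are simple arithmetic; a minimal sketch in Python, using the refresh rates mentioned above, follows.]

```python
# Frame-time budget at a given refresh rate: a GPU driving the display
# natively has 1000 / Hz milliseconds to finish each frame.
for hz in (60, 240, 1040):
    print(f"{hz:>4} Hz -> {1000 / hz:.2f} ms per frame")

# Prints:
#   60 Hz -> 16.67 ms per frame
#  240 Hz -> 4.17 ms per frame
# 1040 Hz -> 0.96 ms per frame
```

[At 1,040 Hz the native render budget is under a millisecond, which is Hort's "not a lot of geometry" point; his preferred trade runs the other way, spending a longer frame time on more, and better, pixels.]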
Journalist 3: So Redstone was kinda this big moment for, for the most part, feature parity — short of MFG, which I think is a question of value to a lot of people — and pushing toward what NVIDIA would call Neural Rendering. You're not announcing any products, not doing anything like that, but I'm wondering: are there more applications of AI and machine learning in rendering? And where do you see it having more applications outside of upscaling and frame generation?
Josh Hort (AMD): The easiest place — because when we talk about it internally, some people just synonymize it with gaming, and I have to stop them and say "no, time out." In workstation, there's a lot of CAD/CAM applications, that whole segment, from Blender, to Autodesk applications, etc.; you guys know them all. [Those applications] lend themselves very well to not only upscaling and frame gen, but also things like ray-trace denoising, like ray regeneration, and even neural radiance caching; the more advanced things we've been bringing towards the ray-tracing [and] path-tracing acceleration. So yeah, the first step is — and actually, we are working with ISVs on bringing FSR to their applications. NVIDIA already does this; the competition already does this, so it's just a natural progression. Did that address your question, or was there another part of it?
Journalist 3: I guess what I'm saying is, for me, when I hear about — obviously, frame generation and upscaling, those make sense, but then I hear about ray regeneration, about neural radiance caching, and as someone who consumes this content and doesn't make it, these are applications of machine learning that I hadn't realized, and they can make a really significant difference. So I'm asking, from your perspective: what are those other applications for machine learning in the rendering pipeline?
Josh Hort (AMD): I see; so like "what other things could we do?"
Journalist 3: Yeah.
Josh Hort (AMD): Well, there's certainly ReSTIR, which is reservoir sampling; I'm trying to think of what else there is. There's neural intersection functions, what we call NIF; basically, when you're doing ray tracing, you're looking for those intersections where the ray intersects with a certain volume, and if you can create a neural method to figure out which rays are important and where they're going to intersect, you can avoid having to go through the BVH tree traversal, which is a very expensive thing. So the fewer BVH traversals you have to do, the faster your ray tracing and path tracing is going to be. And really, path tracing is what, just multi-bounce ray tracing, right?
So we can actually get to the path-tracing era, because there are hardly any titles out there that support path tracing, because you need a very expensive graphics card in order to run it. So that's what ML is going to bring: it's really going to democratize it, bring it more to the masses, right? I think Lisa said it best yesterday — AI everywhere, for everyone, right? We really believe the same thing, and with FSR Redstone, and what's coming next, and so on and so forth — yeah, we want to bring this to the masses. And it's really the functions and the features that we think are going to be useful, versus just some candy or PowerPoint where it doesn't have ISV uptake, developer uptake.
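[Note: for readers unfamiliar with the technique, a toy sketch of the idea behind a neural intersection function follows; the scene, the predictor, and all names here are hypothetical illustrations, not AMD's NIF. The point is that rays the cheap predictor rejects never pay for BVH traversal.]

```python
import random

def expensive_bvh_traversal(ray_dy: float) -> bool:
    """Stand-in for full BVH traversal (many box/triangle tests per ray).
    Ground truth in this toy scene: downward rays hit the floor."""
    return ray_dy < 0.0

def nif_predict(ray_dy: float) -> bool:
    """Hypothetical cheap predictor; in a real neural intersection function
    this would be a small network trained on traversal results. It is
    deliberately conservative, so it never culls a ray that would hit."""
    return ray_dy < 0.05

random.seed(0)
rays = [random.uniform(-1.0, 1.0) for _ in range(100_000)]

# Only rays the predictor keeps pay for the expensive traversal;
# the rest skip the BVH entirely.
kept = [dy for dy in rays if nif_predict(dy)]
hits = sum(expensive_bvh_traversal(dy) for dy in kept)
print(f"rays: {len(rays)}, skipped: {len(rays) - len(kept)}, hits: {hits}")
```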
Journalist 5: Will there ever be support for Redstone on RDNA 3?
Josh Hort (AMD): [group laughing] Haha, he asked the same question. [Journalist 4 above]
Journalist 6: I got to play today with Panther Lake with multi-frame generation. It was pretty good for a very sleek and mobile device. Ryzen AI 400 is coming not only to notebooks and [unintelligible] but also to desktop parts, so there will be small desktop PCs with Ryzen AI 400, and you get a pretty good experience on a big screen; you have a small tiny box over there, with good frame generation, so, it is ... the way to go, I guess.
AMD Representative: That's kind of a hardware question, right? It's kind of hard for Josh to answer.
Journalist 6: Yes, sure, but —
AMD Representative: Your real question is "why isn't it RDNA 4 in Ryzen AI 400," right?
Journalist 6: No, that's [laughing] no, that's really another question — that's a BIG question.
Josh Hort (AMD): That one was a product decision, right? And I'm not the product decision-maker.
Journalist 6: Ryzen AI 400 is also a refresh of the 300 series, I get that. Maybe in the next version we will see RDNA 4 or something like that. [...] I think the market is big, especially for the notebooks, and the tiny desktops.
Josh Hort (AMD): The demand is out there, is what you're saying.
Journalist 6: Yes.
Journalist 3: I'll tell you what I would like to see. [Journalist 5] mentioned Lossless Scaling earlier; when AFMF first came out, I was expecting AMD to approach the developer of Lossless Scaling to bring that into the driver. And speaking of multi-frame gen, I think one of the applications for it that would make a lot of sense to me is in the driver, through AFMF, to scale up higher, which is obviously something Lossless Scaling can do to varying degrees of success. That is definitely something that I think could turn [points to small PCs] that, that, into — not a gaming machine necessarily, but —
Josh Hort (AMD): Capable of running something, yeah.
Journalist 3: And obviously you guys have made a lot of strides with ISV engagement, we mentioned FSR3 earlier, but you can't engage everyone, it's not possible, and that driver-level solution for me has been really great.
Josh Hort (AMD): It's definitely something we're investigating closely, how we can bring more of the technology to the driver so it doesn't have to be what I call "enlightened," which means like, in-engine, in-game, versus "unenlightened," where the driver takes care of it. And yes, we are definitely investigating — again, no product plans, no announcements today — but absolutely we're looking at how we can bring some more ML tech to the driver level. It makes it easier to distribute, it gives you backward compatibility with games that will never get updated ever again. The publisher might not even exist anymore.
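[Note: the "enlightened" versus "unenlightened" split comes down to what data each integration point can see; a minimal sketch in Python, with hypothetical types and function names, follows.]

```python
from dataclasses import dataclass

@dataclass
class EngineFrame:
    """What an "enlightened" (in-engine) integration sees each frame."""
    color: bytes
    motion_vectors: bytes  # per-pixel motion straight from the engine
    depth: bytes

@dataclass
class PresentedFrame:
    """What an "unenlightened" (driver-level) pass sees: the final image only."""
    color: bytes

def engine_frame_gen(prev: EngineFrame, cur: EngineFrame) -> bytes:
    # Can warp using real motion vectors and depth, so quality is higher.
    ...

def driver_frame_gen(prev: PresentedFrame, cur: PresentedFrame) -> bytes:
    # Must estimate motion analytically from two color images alone, which
    # is why driver-level approaches are described as "analytical" above.
    ...
```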
Journalist 7: [Andrej Zdravkovic, AMD SVP of GPU Technologies] mentioned that the plan was to still open-source FSR4, and I wanted to confirm that with you after, obviously, the source code came out in August I think it was? That's still the plan, to open-source—
Josh Hort (AMD): Each of the technologies will be on their own trajectory for open-sourcing. I don't make the decisions or do the open-sourcing; my team executes it, like puts it on the website. But it's a strategic discussion on what gets open sourced and what doesn't. If Andrej told you we're gonna open source it, then — well —
Journalist 7: [laughing] You won't dispute that claim?
Josh Hort (AMD): Now, [as for] when? I can't say when. Obviously, we try to be as open as we possibly can without giving away the farm. Because we want proliferation of the technology, right? There's partners who come out of the woodwork that could be competitors of ours, who also want to be partners on things like this because it's just good for the ecosystem. FSR was picked up by Microsoft for their AutoSR implementation, for instance, and the main reason why they picked it is because it was open. Open source is good for everybody. [Note: this is incorrect as far as we know; AutoSR is ML-based and uses a custom CNN.]
Journalist 8: You made a comment that we've made a lot of progress with ISVs [...] if your team was in charge of this, was there like a change in strategy, or more of a focus? How did you get from point A to point B, where you got like 200 games in a couple months?
Josh Hort (AMD): I think I kind of hinted at it, right? I think putting it out on GPUOpen, we hit critical mass. I think it was a flywheel where the game publishers saw "OK, this tech actually works pretty well," and then the Digital Foundry review came out; they did the pixel peeping and were like, "Wow! This is pretty fantastic." It was a glowing review, which was great for us, right? But it puts us on the map where like, OK, the technology's at a point where it's mature, it's ready to go. And my team put in a lot of effort in the ISV enabling portion, but also, like I said, putting it on GPUOpen so that anybody can pick it up I think really helped propel us way faster than we could have gone with just the people power that we have.
Journalist 8: Following up on the change in strategies, when you're approaching ISVs for integration, particularly in games, outside of budget and time constraints, do you hear anything else from developers concerning their hesitation for adding features, especially the latest and greatest features?
Josh Hort (AMD): It depends on what piece of technology is getting inserted into the pipeline and where. If you look at things like super resolution, and frame generation, those are both post-processing steps at the end of the pipeline, so it's very easy to integrate them. Now, denoising, the ray regeneration portion, that can be more tricky, because some game engines have a fused denoiser where they want the denoise and the super-res step to happen at the same time; others want them completely separated because they're at different parts of the pipeline.
So when you have a fused denoiser, it provides performance, but it can't be used by all game engines, just because of the way they are written. Something like neural radiance caching is even more complex, because it needs a lot of different input information into the model to get the output, and that information might not be readily available in the format the model requires in order for it to operate. It's also kind of intrusive into the pipeline. Long story short, it's harder to integrate.
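[Note: a minimal sketch of the two pipeline shapes described above follows; the pass names are illustrative stand-ins, not an actual engine or FSR API.]

```python
# Illustrative stubs; a real engine would dispatch GPU compute passes here.
def denoise(img, gbuffers): return img
def super_resolve(img): return img
def denoise_and_super_resolve(img, gbuffers):
    return super_resolve(denoise(img, gbuffers))

def separate_passes(noisy_rt_image, gbuffers):
    # Denoising and super resolution are independent steps, possibly far
    # apart in the frame, so a vendor denoiser is easy to slot in.
    denoised = denoise(noisy_rt_image, gbuffers)
    # ...other engine passes can run here, between the two steps...
    return super_resolve(denoised)

def fused_pass(noisy_rt_image, gbuffers):
    # One combined dispatch: faster, but the engine must be structured so
    # both steps can happen at this single point in the pipeline.
    return denoise_and_super_resolve(noisy_rt_image, gbuffers)
```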
Journalist 8: So, especially for those features, you're looking to engage developers before release as much as you can?
Josh Hort (AMD): Yeah, so like we did a lot of work with Fatshark, because they can move really fast, and they were embracing the technology; not necessarily "building the plane as we're flying it," but super cutting-edge, rapid iteration, super close, deep technical partnership. The amount of work we did in a small amount of time is phenomenal. But we have more work to do, and that's going to be a focus in 2026: getting that Neural Radiance Caching feature — not only getting it into more titles, but also improving on the integration, the API and whatnot; so as we get more feedback from our ISV partners, we'll be improving it along the way in 2026 and beyond.
Journalist 1: How's Redstone been with VR? Any issues with ghosting, or?
Josh Hort (AMD): With VR? I haven't tried it personally; I don't have a VR headset — I used to, but in my copious spare time, it was collecting dust, unfortunately. I haven't heard first-hand of any issues. I would say frame gen in general might be a problem, because I know from past experience that latency is so critical to avoiding nausea. If your brain can perceive one frame that's behind, it immediately gets sick; you're done. Some people don't notice it, but a lot of people do, and that's why they get sick. And if you think about it, frame gen today is: you're taking a past frame and the current frame and inserting another one, so you have to hold a frame. You're naturally introducing that latency.
By default, you've already voided the benefit there, because you're going to get sick. Now super res, sure! It's a fantastic technology. It's almost a requirement, because you want to get to 240 Hz. Like, he was talking about 1,040 Hz; in a VR headset, 1,040 Hz actually might be pretty cool. High-speed, super realistic — but again, you're cutting your frame time down, your render time, to one millisecond or something, depending on what your resolution is and whatnot, so you're not going to be able to do a lot of high-quality features.
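[Note: a back-of-the-envelope model of the "hold a frame" cost follows; it is a simplified sketch that ignores interpolation compute time and display scanout, under which showing the generated frame first pushes the real frame back by about half the base frame interval.]

```python
# Interpolation needs the past AND the current frame, so the current frame
# can only be shown after the generated in-between frame has had its display
# slot; that delays the real frame by roughly half the base frame interval.
for base_fps in (30, 60, 90):
    base_interval_ms = 1000 / base_fps
    added_latency_ms = base_interval_ms / 2  # simplified: compute time ignored
    print(f"{base_fps} FPS base -> ~{added_latency_ms:.1f} ms extra latency")

# 30 FPS base -> ~16.7 ms extra latency
# 60 FPS base -> ~8.3 ms extra latency
# 90 FPS base -> ~5.6 ms extra latency
```

[At VR refresh rates, where even a few milliseconds of motion-to-photon delay can induce nausea, that overhead is exactly the problem Hort describes.]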
[Session ends with off-topic VR game discussion.]