I heard from one of the people who do HW reviews that AMD was considering
implementing their very own RTX and was looking at what people think of RTX.
I created a poll on LQ:
https://www.linuxquestions.org/questions/showthread.php?p=5962219 and in the past 23 hours it has gotten 5 votes! :(
I am posting to this ML and several other places in hopes of
getting a better turnout.
Thanks for your vote.
Please comment on the actual LQ poll page, not this email.
Yes, Ray Tracing eXtensions (RTX) may be relevant to video editing, and
therefore, ffmpeg, at a later date.
As a follow-up:
It was brought to my attention that some people think that RTX is
marketing jargon and/or that my poll is an attempt at forcing a
technology upon companies.
Ray Tracing eXtensions is not jargon. It takes up a large portion of Nvidia's
GPUs' die space; you pay for it whether or not you use it. As for
forcing, that was *never* my intent. Nvidia has things, like PhysX and
HairWorks, that AMD does not. Likewise, the reverse is true for things like
Vega's HBM being used as a last-level cache.
On Fri, Feb 15, 2019 at 13:43:44 -0500, Ted Park wrote:
> > Yes, Ray Tracing eXtensions (RTX) may be relevant to video editing, and
> > therefore, ffmpeg, at a later date.
> How? This is hard to imagine.
On Fri, 15 Feb 2019 22:37:00 +0100
Moritz Barsnick <[hidden email]> wrote:
> On Fri, Feb 15, 2019 at 13:43:44 -0500, Ted Park wrote:
> > > Yes, Ray Tracing eXtensions (RTX) may be relevant to video editing,
> > > and therefore, ffmpeg, at a later date.
> > How? This is hard to imagine.
> I agree.
> 10 years ago, Intel tried to make a case for ray tracing in games:
> https://techreport.com/news/17641/intel-shows-larrabee-doing-real-time-ray-tracing
>
> So, now it finally seems to be coming. I do see the use case.
> I don't see this case for videos, as ffmpeg processes. Even if there is
> a small niche for it, just about *everything* would be different -
> except the output, if rendered to pixel graphics.
> My 2¢, as of today ;-),
If you want my theory, it goes like this: currently, videos of movies
depict the camera viewing one or more of the subjects from one particular angle
and at one particular zoom factor.
Now, if you used some tech, say Intel's new "immersive media",
then you could combine the known positions of people and objects, ray
tracing, and each item's color to allow a video file to be played back
from different perspectives, at different zoom factors, and looking at
different angles.
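
To make that a little more concrete, here is a minimal, purely
hypothetical Python sketch (the scene, the names, and the numbers are all
my own assumptions, not any real RTX, Intel, or ffmpeg API): once the
positions and colors of the objects are known, a primary ray can be traced
from any camera position, so the same scene data can be viewed from
whichever perspective the viewer picks.

    # Hypothetical sketch: given known object positions and colors (toy spheres),
    # trace one primary ray from an arbitrary camera position and return the
    # color of the nearest object hit. Nothing here is a real RTX or ffmpeg API.
    import math

    def ray_sphere_hit(origin, direction, center, radius):
        """Distance along the ray to the nearest intersection, or None."""
        ox, oy, oz = (origin[i] - center[i] for i in range(3))
        dx, dy, dz = direction
        a = dx*dx + dy*dy + dz*dz
        b = 2.0 * (ox*dx + oy*dy + oz*dz)
        c = ox*ox + oy*oy + oz*oz - radius*radius
        disc = b*b - 4*a*c
        if disc < 0:
            return None
        t = (-b - math.sqrt(disc)) / (2*a)
        return t if t > 0 else None

    def trace(camera_pos, direction, scene):
        """Color seen along one ray: nearest sphere wins, else black background."""
        nearest, color = float("inf"), (0, 0, 0)
        for center, radius, rgb in scene:
            t = ray_sphere_hit(camera_pos, direction, center, radius)
            if t is not None and t < nearest:
                nearest, color = t, rgb
        return color

    # Two "known objects" with positions and colors; move the camera freely
    # and the same scene data renders from any perspective.
    scene = [((0, 0, 5), 1.0, (255, 0, 0)),
             ((2, 0, 7), 1.0, (0, 255, 0))]
    print(trace((0, 0, 0), (0, 0, 1), scene))  # camera at origin -> sees red
    print(trace((2, 0, 0), (0, 0, 1), scene))  # camera shifted right -> sees green

A real system would of course trace millions of rays per frame, which is
where dedicated ray-tracing hardware would come in, but the principle is
the same.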
It's a long way off, I know, and at the risk of stating the obvious, the
decisions that are made today are the ones that affect tomorrow.
With AMD's interest in what buyers think of RTX, and the fact that
they have created and released open-source drivers, demonstrating
that we are worthy of their time, the Linux community has a chance
to finally be included in the discussion! I don't understand the
community's lack of enthusiasm! I'm up to only 9 votes across 6 sites.