# GeForce FX5700. Is it time for Radeon to leave your machine? Yes it is.



## Nich

Hi.


As some of you might know, I just received my ASUS GeForce and have plugged it into my HTPC, replacing a Radeon 9500. In between I had my GeForce 5600 in the machine, too. Before any of these cards, I used a Radeon 7500.


I'll break this review into two parts, this being the first(!). Please keep in mind that this is a stock card, no MP-1 mods or anything.


The projector is a Marquee 8500 Ultra fed by RGBHV. The night's test discs were: Resident Evil, Pink Floyd: Live at Pompeii, The Fast and the Furious, and a Danish series called 'Rejseholdet' (Unit One is the English title).


There is no question about it, this card is way better than the Radeon 9500 I've used for some time now. I saw it right away during Resident Evil, especially in the backgrounds. They were way sharper and much cleaner. I must admit that I didn't expect a large difference, if any at all. The scenes inside the complex at the beginning had razor-sharp backgrounds.


But what really caught my eye was the color. The colors were so vivid, I don't quite know how to explain it, but it was like there were more of them during a transition. Like when you use a paint program and choose gradient fill, and you use 100 colors instead of 70. I hope that makes sense.
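The gradient-fill analogy can be pictured with a quick sketch (purely illustrative; the function name is made up and this is not anything the card literally does): quantizing a smooth ramp to fewer levels leaves fewer distinct shades, so each band gets wider and more visible.

```python
# Illustrative sketch: quantize a 0-255 grayscale ramp to a given number
# of gradient steps, then count the distinct output levels. Fewer steps
# means fewer distinct shades and wider, more visible bands in a fill.
def quantize_ramp(steps, width=256):
    # snap each pixel of the ramp to the nearest of `steps` levels
    return [round(round(x / (width - 1) * (steps - 1)) / (steps - 1) * 255)
            for x in range(width)]

coarse = quantize_ramp(70)    # a 70-color gradient fill
fine = quantize_ramp(100)     # a 100-color gradient fill

print(len(set(coarse)), len(set(fine)))  # 70 100
```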


Then I popped in Live at Pompeii, because the sunlit photography looks awesome. This is where the card really came through. Once again, amazing color reproduction: the ground in the amphitheater shows many different shades now, from dark dirt to light. It also seems a lot sharper; even in long-distance camera work I could still make out details on people and so on. That must come down to cleaner backgrounds with less noise.


The music never sounded better.


Yet another thing was the dynamics of the image. It felt like the contrast range had been increased. I didn't set everything up 100%, but toyed a little with gamma. Maybe it's just me, but it seems to work a lot better than on the Radeons. When I cranked it up high, I didn't get the same problems as with the 9500 (greenish image and so on).


My girlfriend's troubled DVDs with her favorite series didn't show any improvement, though. Seems crap is crap. However, she'll look at them tonight; maybe her opinion will differ. She has spent MANY hours with them, so she'll be able to tell right away.


It was during the series that I first saw the problem. The lip sync was off, way off. I put in Roger Waters' 'In The Flesh' (PAL) and yes, it was horrible. I've used 72 Hz for some time now without ANY problems for PAL, NTSC, and HD, but now we had lip-sync problems. Everyone's #1 fear. Since it was a PAL disc, I upped PowerStrip to 75 Hz, and everything was normal again. Go figure.
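The refresh-rate fix makes arithmetic sense: 72 Hz divides evenly by film's 24 fps (a clean 3:3:3 cadence) but not by PAL's 25 fps (72/25 = 2.88), while 75 Hz does. A little sketch of this (the `cadence` helper is made up for illustration; an uneven repeat pattern means judder, and with a fixed audio clock, potential sync trouble):

```python
from fractions import Fraction

# Illustrative helper: how many refreshes each source frame is shown
# for, over one repeating cycle, at a given display refresh rate.
def cadence(refresh_hz, fps):
    ratio = Fraction(refresh_hz, fps)      # refreshes per source frame
    cycle = ratio.denominator              # frames until the pattern repeats
    counts, shown = [], 0
    for i in range(1, cycle + 1):
        total = int(i * ratio)             # refreshes consumed after frame i
        counts.append(total - shown)
        shown = total
    return counts

print(cadence(72, 24))   # [3] -> even 3:3:3, film is happy
print(cadence(75, 25))   # [3] -> even again, PAL is happy
print(cadence(72, 25))   # uneven mix of 3s and 2s over 25 frames: judder
```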


The Fast and the Furious also looked great, with a lot of background detail I've never really seen before. Long shots of the city, masses of people, etc.


So, my initial thought is that this is the card for me, and maybe for you too. I had my eye on it for gaming, but since it did such an amazing job on the screen, it'll never see 3D games. Once again, the gamma setting is really great, which is important for me; the image is tight and razor-sharp with GREAT and VERY vivid colors, by far surpassing the Radeons I've used.


Part II will come tonight. I'll use AVIA and then make the ultimate test: 2001: A Space Odyssey. White spaceships askew on black backgrounds. Do you remember seeing Heywood Floyd sitting in front of the craziest moiré ever? I'll go nitpicking tonight.


If you're looking to buy a new card, consider this one, for sure. Be advised that you might have lip-sync issues.


Thanks for your time,


Nicholas


----------



## pcCinema

I'm going to buy one. So is ASUS the way to go? I've compared pics, and it appears the ASUS uses different caps from all the others. Are they better? The others all seem to use SMT caps and appear identical part-for-part, but the ASUS appears to differ a bit, at least in the caps.


Any ideas here to help with the choosing? I'd even choose based on bundled-software differences if there's no real hardware difference for PQ, but first choice would obviously be the cleanest video.


I'm sure glad I stuck with the 8500 as long as I did and hadn't already upgraded.


I'd better order it online. If I go to Fry's, I'll walk out with too much other stuff. Still trying to wait for HDDs with no rebate BS.


Troy


----------



## mp20748

Quote:

_Originally posted by pcCinema_
*


I'm going to buy one. So is Asus the way to go? I've compared pics and it appears the asus uses different caps from all the others. Are they better?


Troy*
No, wait for the NVIDIA. It's physically very different from the ASUS 5700. The visual difference is huge. The NVIDIA has many more (and larger) caps than the ASUS, plus it shows far more components than the ASUS.


So it appears that they both may be using the same processor, but we're not sure that the ASUS has the same "Ultra" feature, or that it has the same engine the 5700 Ultra is using, because the NVIDIA even has a much larger heatsink.


There's no way of looking at the two cards and saying that they are the same. So what is the difference?


What we are testing on the 5700 Ultra could be a feature that is not in the ASUS, and the ASUS may not benefit from the DVD software that is being beta-tested on the HTPC forum and offered by NVIDIA only.


----------



## JeffY

Quote:

_Originally posted by mp20748_
*No, wait for the NVIDIA. It's physically very different from the ASUS 5700. The visual difference is huge. The NVIDIA has many more (and larger) caps than the ASUS, plus it shows far more components than the ASUS.

*
Did I miss something here? I didn't think NVIDIA made cards.


----------



## JeffY

What about this one? http://www.bjorn3d.com/Material/Images/399_P1010035.JPG


----------



## Paul Butler

The MSI card certainly seems to be missing the same chips and has a similar heatsink layout to the BFG card.


Could be that MSI source their cards from BFG with only minimal changes (colour of the board being the most obvious).

Paul


----------



## mp20748

The card that I have has NVIDIA on it only. I have no idea who made it, but I do know that there are several card manufacturers who use the same chipset. My point is, all of these cards may not be the same in programming and such as the NVIDIA card is. NVIDIA is the only one that I know of that has designed a software DVD player for their cards, which is in beta testing on the HTPC forum. As with ATI, just because the chipset is the same, that does not mean that the software will work on every card that has the chipset.


I had an NVIDIA card sent to me for testing from a software designer. I'm also having three NVIDIA cards sent to me directly from someone at NVIDIA for testing. One thing I know from playing around with the cards is that you cannot assume they're all the same in performance, mainly because each board can have different components, and the chipset can be externally flashed by the manufacturer of the card to perform specific functions.


----------



## Paul Butler

Ah, that explains it. I think Mike has a "reference" design card and not a retail card as such; although retail cards from the likes of Creative, BFG, etc. may be identical, there is no guarantee. See the image attached.

I'm fairly sure that Creative did use reference card designs in their earlier GeForce releases, but I'm not sure about nowadays.


Paul


----------



## mp20748

 http://www.avsforum.com/avs-vb/showt...ht=Nvidia+Beta


----------



## mark haflich

Personally, as one definitely in the dark, I think you all should wait for the holy grail. 


God what gibberish to a computer illiterate like me.


Clearly, MP has some prototype NVIDIA cards, i.e., versions of the NVIDIA card not yet on the market. Could these be improved for CRT use by the MP mod?


----------



## WanMan

Mark, have you read any Dan Brown? You need to go read The Da Vinci Code and then re-read your post! Nich, great feedback, but I am more than a little curious about the origins of the card he is beta testing.

Mike, was the beta-tester program short-term (meaning it's over for applicants)? BTW, all of you are going to keep me in trouble with the little women in my life.


----------



## Prometheusbound

I believe that Nich's card is a non-Ultra FX5700. Is that correct, Nich? This may explain the difference in components.


----------



## Nich

Yes, you're right. Non-ultra.



Nicholas


----------



## Phil Smith

I've been looking around, and it does not appear to me that NVIDIA actually manufactures cards. That's why I asked which one to buy in the other thread. Am I missing something?


PS: BFG Technology (Asylum) is what Vern is using and Terry is testing.


----------



## Nima

 http://www.digit-life.com/articles2/gffx/gffx-35.html


----------



## THECLOSER

WanMan,


The beta tester program can be entered at the HTPC forum, but there is a qualifier: only the folks who have purchased the previous card and have proof of purchase can apply for the beta program from NVIDIA.


----------



## WanMan

Does the beta program application guarantee entrance to the beta program? And when you say 'previous card' do you mean the previous generation card (what would that be)?


----------



## GScott

Quote:

_Originally posted by WanMan_
*Does the beta program application guarantee entrance to the beta program? And when you say 'previous card' do you mean the previous generation card (what would that be)?*
I think you had to purchase the earlier version of the DVD software. From what I read, the beta program is closed and they are getting close to shipping the new software. I'm a little worried because several times someone asked how the usability and interface compared to TheaterTek, and the question was conveniently never answered.


----------



## WanMan

Oh well. Maybe the email santa will bless me. {hint, hint, wink, wink}


----------



## Vern Dias

Quote:

I'm a little worried because several times someone asked how the useability and interface compared to Theatertek and the question was conveniently never answered.
Actually, I'm pretty sure it was answered previously, but I'll answer it here (from an end-user perspective; I am not affiliated with NVIDIA). The NVIDIA player should be roughly compared to WinDVD or PowerDVD in terms of user interface.


It is NOT a direct TT replacement. That said, its strengths are in the quality of the DirectShow audio and video decoders, and its unique DCDI-like 3-2 pulldown module, which does NOT depend on the DVD being properly flagged.


When combined with Zoom Player Pro, I feel that the combination stomps all over ANY other player in the market.


Just my $.02.


Vern


----------



## GScott

Thanks Vern. Hopefully either Andrew will license the decoders or the new player will pass the Wife Usability Test.


----------



## Nich

Hi Guys.


We finally sat down, my girlfriend and I, and spent some time watching. She put her series on and was convinced right away: the backgrounds and colors were WAY better. She noticed the forests behind people, the color of the leaves, and the horizons. She was very thrilled about this level of video quality.


There really is something great about this card. I can't recommend it

enough. I popped in Resident Evil again, and I'm amazed at the detail

and lack of errors. It's the best image I've seen so far.


The testing ends now, I've made my choice.


Thanks,


Nicholas


----------



## Nima

Nicholas, did you use scoped settings, or did you leave everything at default?


Nima


----------



## Bill Gaw2

Nich: Do you have the model numbers, name, etc. of the card?


Bill


----------



## Nich

Bill, here's the link to the ASUS page.

http://uk.asus.com/products/vga/v9570td/overview.htm 


Nicholas


----------



## WanMan

Nicholas, which DVDs was the wife/girlfriend watching to notice a difference? BTW, the lip-sync issue seems like one a company like ASUS would want to take advantage of by getting together with M-Audio to create a paired solution without this problem. Of course, Creative could do the same completely in-house, but those people seem to suffer from pump-head.


----------



## Nich

EVERYBODY PLEASE READ THIS:


We were watching a movie last night, and the audio would drop out every five minutes or so. I didn't know what to do, so I checked behind the HTPC, and it seems that I didn't reconnect the coax properly the first time... sooo, that MIGHT have something to do with the lip sync. I'll try 72 Hz again later, just to be sure.

WanMan, my girlfriend 'noticed' it on every disc.


Nich


----------



## cpurvis

Just to clarify: NVIDIA does NOT make retail graphics cards. They make the graphics chip that powers the card and sell it to OEMs, who make the cards that are sold at retail. In this way NVIDIA is analogous to Intel: both make CPUs (graphics-oriented CPUs in the case of NVIDIA, as opposed to general-purpose chips from Intel).


MP's card is an NVIDIA reference board. These are not available to the general public; they are manufactured by NVIDIA in relatively large numbers but are strictly for other hardware or software developers (like me) who need to verify that the video drivers work properly with the other hardware and/or software they need to interface with. (ATI, on the other hand, sells their reference designs under their own retail name directly to the consumer, as well as providing raw chips for other OEMs to build cards around.)


Usually the manufacturers tend to stick pretty closely to the reference board design, but not always. Tom's Hardware is a good place to read detailed reviews and comparisons of the cards made by different manufacturers that are based on the same NVIDIA core. They generally do a good job of pointing out the physical differences and attributes of each card. There's a lot of information there that probably won't make sense to most people, and I wouldn't pay any attention to the benchmark scores of the various cards, since none of that really applies to using one for DVD playback in a HTPC. However, there's always the possibility that NO manufacturer will reproduce a board with the same quality of components as the reference cards from NVIDIA.


----------



## cpurvis

Another thing about Ultra and non-Ultra cards from NVidia:


Until recently, the difference between these two types of card in HTPC parlance was zero. Ultra just signifies that the card's CPU runs at a higher clock speed, which is beneficial for gamers who actually care about the difference between 95 and 105 fps at 1600x1200 with all 3D effects enabled. The reason the Ultra cards look more "burly" is that the higher clock speed requires increased heat dissipation (more heat sinks) and often also requires higher-grade memory chips that can handle the higher address rate. What does any of that mean for watching a DVD? Zero.


Well, maybe not zero. Now that people are starting to use VMR9 instead of the old video layer, some aspects of the 3D hardware do get used when watching a DVD. However, I can almost guarantee that the difference in processing speed between an Ultra and a non-Ultra 5700 is not going to matter for DVD playback; really, we are using maybe 1/1000th of the processing power of these cards when we play back a DVD on a HTPC. Using Direct3D VMR9 filters will increase the load, but not enough, I'm sure, to matter. That said, there may be more robust components on some Ultra cards that lend themselves to a higher signal-to-noise ratio and thus better image performance. But that would depend on the specific design of the card itself (see the above post), not just on it being an Ultra. In other words, a plain 5700 from manufacturer A might very possibly look better with DVD video than a 5700 Ultra from manufacturer B. Or they might be identical.


I guess what I am saying is that no one should assume an Ultra is better for HTPC purposes until it's been proven in a back-to-back test, and even then it will most likely matter who made the card. Also remember that these cards' primary purpose is playing real-time rendered 3D games, and 99.9% of the specifications and marketing hype around them is oriented toward that, NOT toward watching DVDs on your projector.


----------



## Vern Dias

cpurvis,


Agreed, but what happens when you fire up the WMV Coral Reef Adventure in 1024p? Does the Ultra have an advantage now?


I don't have the answer to this, but I can tell you that my 5950 Ultra plays CRA in 1024p to perfection. Not a stutter, not a glitch.

3.2 GHz P4 processor, by the way.

Vern


----------



## WanMan

Vern, what is the bitrate on that CRA piece?


----------



## JeffY

The build quality of the card tends to be much better with the more expensive (faster) cards.


----------



## WanMan

Jeff, are you saying that the actual PCB and component-on-PCB quality is actually different within a given manufacturer's assembly process?


----------



## JeffY

Yes, that is exactly what I'm saying.


----------



## Nima

Here's a quote from the test I posted on page 1:


"All the cards are based on the reference design. AOpen put in the box the very reference card. Only InnoVision changed the design a little bit". 


Does that mean they all use the components NVIDIA suggested?


Nima


----------



## cpurvis

Quote:

agreed, but what happens when you fire up the WMV Coral Reef Adventure in 1024P? Does the ultra have an advantage now?
Honestly? I doubt that it does. Unless these new cards have hardware Corona decompression (do they yet? I don't think they do), the card isn't really doing all that much work.


Yes, the ultra is faster - but for what we are using it for it's like the difference between using 5 locomotives or 6 locomotives to pull one empty boxcar. In the real world, there's not a lot of difference between the two.


----------



## Bart Tichelman

I am not an expert on these items, but I read a comment elsewhere (from either Mike Parker or Andrew Chilvers of TheaterTek) that only the 5700 has the latest MPEG decoder, which does have an impact on DVD playback.


----------



## Vern Dias

Wanman:


I don't know the bitrate of CRA in 1024p, but I do know that my 3.2 GHz P4 runs at about 50% utilization playing it.


Vern


----------



## WanMan

Vern, I wonder which P4 processor reached 80-90% CPU utilization running the same piece of media.


----------



## JeffY

I'm trying out an MSI 5700 Ultra, always happy to try new things 


Colour, sharpness and cleanness all look good. DXVA MPEG decoding with TheaterTek is shockingly bad (compression artefacts everywhere); let's hope the card has a few tricks that the Sonic decoders can't access (very possible). Stutter is very pronounced; might be fixable. The MSI card is very quiet (as it says on the box) and runs very cool. I guess I'll have to wait for the new NVIDIA player to see if it's a goer.


----------



## jcmccorm

Well, that's bad news. I was hoping to try my new 5700 this weekend with TT. I guess not!


Cary


----------



## Mikedd

Quote:

_Originally posted by JeffY_
*I'm trying out an MSI 5700 Ultra, always happy to try new things 


Colour, sharpness and cleanness all look good. DXVA MPEG decoding with TheaterTek is shockingly bad (compression artefacts everywhere); let's hope the card has a few tricks that the Sonic decoders can't access (very possible). Stutter is very pronounced; might be fixable. The MSI card is very quiet (as it says on the box) and runs very cool. I guess I'll have to wait for the new NVIDIA player to see if it's a goer.*
I started a thread about the same thing a few days ago. I was using an ASUS 5700 with WinDVD Platinum and HW overlay. I had real issues with crushed blacks and whites. It's not my system, as I compared 4 cards that day. I am going to give it another try this weekend and see what gives. I had some microstutter as well. Couldn't get the timings correct, as PowerStrip would not work with it. My ATI 9600 is back in, but the high-frequency peaking is just pissing me off to no end.


Mike


----------



## Nich

Strange that you all have these problems.


I love that card.


Nich


----------



## JeffY

Has anyone tried the newer unreleased drivers yet? I can find 56.56.


----------



## JeffY

Quote:

_Originally posted by Nich_
*Strange that you all have these problems.


I love that card.


Nich*
With all due respect Nich, I'm not sure you know what to look for.


----------



## cpurvis

Quote:

Couldn't get the timings correct, as PowerStrip would not work with it
You need to make sure you've downloaded a fairly recent version of PowerStrip for it to work with a GeForce FX. The current version is 3.49, although I think 3.47 might be okay too.


----------



## Vern Dias

You need the latest PowerStrip to support the newer NVIDIA cards. I am running 3.48 Build 431.

JeffY, I'm guessing that if you are seeing artifacts, then you didn't recalibrate the white/black/gamma levels in your player. The overlay/VMR9 color controls are way different on the NVIDIA compared to the ATI.


Vern


----------



## jcmccorm

JeffY, I think Nich is referring to the 5700 card. (not too hard to spot stuttering problems)


Cary


----------



## Vern Dias

Quote:

With all due respect Nich, I'm not sure you know what to look for.
Well, I sure know what to look for (on a 12' screen from a 15' viewing distance, no less), and I don't have any of these issues. (However, I am using a P4, and it has been proven that players use different code paths for different processor optimizations, so I can't speak for AMD users.)


I am also running a 5950 Ultra.


Vern


----------



## JeffY

Quote:

_Originally posted by Vern Dias_
*

Jeffy I'm guessing that if you are seeing artifacts then you didn't recalibrate the white/black levels on your player. The overlay/VMR9 color controls are way different for the Nvidia compared to the ATI.

*
Actually I did, but what I'm seeing is bad MPEG compression artefacts. The sort of thing you see with really poor MPEG decoders (like the software-only ones).


----------



## JeffY

Quote:

_Originally posted by Vern Dias_
*on a 12' screen from a 15' viewing distance, no less*
It's not the size, it's what you do with it that counts!  Anyway I have bigger tubes.


----------



## WanMan

Quote:

_Originally posted by JeffY_
*With all due respect Nich, I'm not sure you know what to look for.*
Now that's a nice comment. First, you suggest Nich may not know what to look for, and second you don't even hint at what he should be looking for.


Mean teacher!


----------



## JeffY

Yes I know, I felt bad when I wrote it.


You need to look for blocking artefacts, banding, and noise that looks like it's pulsating. You won't see this stuff on reference DVDs, which is probably what you're looking at since you want to see how good it is, right? When evaluating MPEG decoding capabilities you need to look at bad DVDs. "Then it's a problem with the source," I hear you cry. Yes it is, but it's what it does with the bad stuff that makes one decoder better than another.


----------



## Vern Dias

JeffY, I know what your problem is:


Your photons and electrons are driving on the wrong side of the street! 


Seriously, I am not seeing anything that you are describing here. Of course I am using the Nvidia FWMM filters and ZP Pro with my 5950 Ultra.


Vern


----------



## Steve Smith

Quote:

_Originally posted by JeffY_
*Yes I know, I felt bad when I wrote it.


You need to look for blocking artefacts, banding and noise that looks like its pulsating. You wont see this stuff on reference DVD's which is what your probably looking at since you want to see how good it is, right? When evaluating mpeg decoding capabilities you need to look at bad DVDs. Then it's a problem with the source I hear you cry, yes it is, but its what it does with the bad stuff that makes one decoder better than another.*
Jeff,


I haven't seen anything like this either. Can you give us some examples so we can check those specific scenes? I installed a 5700U the other day, and so far I'm pretty impressed with it. Definitely an improvement over my Radeon 9800 Pro. Right now I'm just using overlay with TheaterTek and impatiently waiting for FWMM to be released so I can see what VMR9 is all about.


What drivers is everyone using?


Steve


----------



## Vern Dias

I just installed 56.56 tonight. No problemo.


Vern


----------



## Nich

JeffY, are you using your magic eyes on an LCD pj?


Please explain MPEG blocking, I don't know what it is.





Nicholas


----------



## JeffY

I'm using a colour-corrected Barco 1209s, not that it matters; these artefacts are very easy to see.


The test footage I use is Star Trek Nemesis R2, Chapter 8. This is a dark scene that has a lot of compression artefacts; look for blockiness in the walls. These won't be stable, they will pulsate on and off, and the walls will look like they are moving. I haven't seen the R1 version of this DVD, so I can't say if it has the same issues… A much easier one for you to try would be Monsters, Inc. (R1) (everyone has that). Look for the scene where they are in the toilets and look at the toilet doors: are they smooth with a good graduation in colour, or do they have blocks of colour on them? Note I haven't tested MI yet, but I'm fairly confident it would show this problem.


----------



## Nich

I don't even have Monsters, Inc....

I haven't seen any errors at all. Could you point out some other discs?

I will try later nonetheless. We will watch a couple of flicks over the next couple of days. But I doubt that I'll find anything at all.


I'm using a Marquee 8500 Ultra.


Nich


----------



## WanMan

JeffY, that's an interesting test sample for the artifact you are describing. Is there another DVD we could test, possibly in R1? On March 13th, HBO or Showtime will be showing the R1 of Star Trek Nemesis in HD. Can you describe the scene in the movie so I will know what to look out for?


----------



## JeffY

It's the scene where Picard and co. meet Praetor Shinzon for the first time.


----------



## WanMan

Ok, I'll look out for it. I typically look for weaknesses in the presented images with gradients of dark colors (especially shadows). If I can find scenes with dark cloud-like (smooth gradients) structures I tend to grab the material and use it as a demo piece.


----------



## JeffY

Walls that have a plain texture tend to be worst affected. The correct term for the artefact is "I-frame pulsing"; if you do a Google search, it should come up with a few DVD examples as well as explanations of what it is and what causes it. Panasonic DVD players do a pretty good job of clearing these up (as do Radeon 9500 cards and above in DXVA mode).
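Since MPEG-2 codes the picture in 8x8 DCT blocks, the blocking shows up as luma steps lined up on every 8th pixel. A crude, illustrative way to quantify that on raw luma rows (the function and the synthetic frames here are made up for the sketch; real measurement tools are more sophisticated):

```python
# Crude blockiness metric: compare the mean luma jump across 8-pixel
# column boundaries with the mean jump elsewhere. A ratio well above
# 1.0 suggests visible blocking aligned to the DCT grid.
def blockiness(rows):
    boundary, interior = [], []
    for row in rows:
        for x in range(1, len(row)):
            jump = abs(row[x] - row[x - 1])
            (boundary if x % 8 == 0 else interior).append(jump)
    avg = lambda v: sum(v) / len(v)
    return avg(boundary) / max(avg(interior), 1e-9)

# Synthetic frames: a "blocky wall" of flat 8-pixel tiles vs. a smooth ramp.
blocky = [[(x // 8) * 10 for x in range(64)] for _ in range(8)]
smooth = [[x for x in range(64)] for _ in range(8)]

print(blockiness(blocky) > blockiness(smooth))  # True
```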


----------



## stylinlp

Whew. How confusing. I've been reading this forum for 2 years now and I'm still lost, lol.

I have a Radeon 9200 SE and was told it has a 12-bit video DAC, so that's as good as it's going to get. I'm using TheaterTek with DXVA enabled (whatever that means).

I was thinking of upgrading to a Radeon 9600 SE, but now I don't know.


----------



## RoBro

stylinlp,

wrong thread?

Roland


----------



## Mark_A_W

The problems with DXVA mode are that ReClock doesn't work properly, and neither does ffdshow. ffdshow may be a matter of taste, but ReClock is essential.


If there are some minor improvements with DXVA, I've never noticed them, and God knows I've looked critically at DVD playback over the last few years.


My setup is not that bad, and neither are my eyes, but maybe my 9000pro is not up to scratch anymore...


I do use DXVA for HDTV viewing though.


----------



## Vern Dias

OK, JeffY, I do see them on the stalls in Monsters, Inc. I have reported the issue to the FWMM team, who have direct access to the driver team.


However, you really do not want to use DXVA, at least with the FWMM player, because the intelligent 3-2 pulldown postprocessor does an excellent job of handling de-interlacing, especially on mis-flagged discs.
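For anyone unclear on what the 3-2 pulldown postprocessor has to detect: NTSC discs mastered from 24 fps film are played out as 60 fields/s by repeating frames in an alternating 3,2 pattern, and a "mis-flagged" disc is one whose stored cadence flags don't match that actual pattern. A toy sketch of the field sequence (frame and field labels invented for illustration):

```python
# Toy 3:2 pulldown: 24 fps film becomes 60 fields/s by emitting 3 fields,
# then 2, per film frame, with top/bottom parity alternating throughout.
def pulldown_32(frames):
    fields, idx = [], 0
    for i, f in enumerate(frames):
        for _ in range(3 if i % 2 == 0 else 2):   # the 3,2,3,2 pattern
            fields.append(f + ("t" if idx % 2 == 0 else "b"))
            idx += 1
    return fields

# 4 film frames -> 10 fields, i.e. 24 fps -> 60 fields/s.
print(pulldown_32(["A", "B", "C", "D"]))
# ['At', 'Ab', 'At', 'Bb', 'Bt', 'Cb', 'Ct', 'Cb', 'Dt', 'Db']
```

A deinterlacer that locks onto this cadence can weave the original frames back together losslessly; a player that only trusts the disc's flags will deinterlace film as video whenever the flags are wrong, which is where cadence detection from the pictures themselves pays off.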


Mark also makes the point of losing all ffdshow functions and ReClock's VSync function in DXVA mode.


Using ANY kind of video processing forces all the DVD players into software mode.


Both WinDVD 5 and FWMM look better in SW mode on both my Radeon 9700 Pro and NVIDIA 5950 Ultra anyhow. My processor (3.2 GHz P4) runs at about 10-15% with ffdshow DScaler sharpen, so CPU utilization should not be a factor either.


Vern


----------



## WanMan

Vern, I guess its a good thing I have no idea what DXVA is.


----------



## jcmccorm

So is it safe to say that if you're a TheaterTek user, it's not necessarily time for the Radeon to leave your machine?


Cary


----------



## stylinlp

Vern, so that's my problem! I bet my TheaterTek is in software mode, since I tried ffdshow a few months back and uninstalled it. It never worked out for me; didn't like what it did.

I only have a 1.0 GHz AMD with 256 MB of RAM and a Radeon 9200 SE.


----------



## JeffY

Trying to do automatic video/film detection usually leads to more artefacts; you are much better off being able to select manually (on the fly), as you can with TheaterTek. No external scalers offer this functionality; it's impossible to force film mode on them. All I look for is the best artefact-free picture I can get. So far that's TT in DXVA mode with a Radeon 9500 or greater. The moment something better comes along, I'll be there in a flash. Apart from the stutter and the MPEG artefacts, the NVIDIA looked really good, so I'm still hopeful.


----------



## Greg_R

How is the quality of the Geforce on video (not film) sources? Watching something like Sessions on West 54th street really shows the shortcomings of the ATI video deinterlacer (horizontal guitar strings, mesh on microphones, etc.). Is the new GeForce any better?


----------



## Vern Dias

Quote:

Trying to do video/film detection usually leads to more artefacts, you are much better off being able to select manually (on the fly) as you can with Theatertek
Not if it's done correctly. Faroudja DCDI being an example.


And NVIDIA is doing it correctly. I have some DVDs that were unplayable on any other software player because of improper flagging of the disc (for example, Short Circuit from Image), but that play flawlessly on the NVIDIA player.


If you really feel that you can switch modes faster than the software can on a mixed mode disc, feel free to do so.


For those who don't react in microseconds but want to force a specific mode, NVIDIA allows 3 options: automatic, manual video, and manual film.


If you are a real minimalist, you don't have to use the post processor at all.


BTW, Nvidia has acknowledged the I-Frame Pulsing issue in DXVA mode and is looking into it.


WanMan: DXVA = DirectX Video Acceleration.


Vern


----------



## CaspianM

How do you enable DXVA in PowerDVD?


----------



## JeffY

Quote:

_Originally posted by Vern Dias_
*If you really feel that you can switch modes faster than the software can on a mixed mode disc, feel free to do so.


For those who don't react in microseconds, but want to force a specific mode, Nvidia allows 3 options, automatic, manual video, and manual film.

*
Actually this is a misconception: if you know it's film, you set it once and you're done. DCDI and other processors will show more artefacts because they will get confused and switch to video mode when they don't need to. Yes, if a DVD really has film and video source mixed together, then forcing film mode will show weave artefacts, but this is actually extremely rare, so rare that I can't think of a single DVD that does it during the main presentation. Of course, menus and extras do tend to be a mixture of video and film; this again is where TT is great, because you can simply start with auto mode on. Once the film starts, if it's NTSC and film it will get detected as such and all will be OK; if it's PAL it sometimes has a bit of trouble, in which case it will be immediately obvious and you can simply press the N key to force film. During a recent event we had a number of external scalers, including the HD Leeza, Lumagen, and others; they all struggled with Titanic (PAL), showing a variety of deinterlacing artefacts, while TT in film mode was perfect. External scalers can't be set to force film mode, only video mode or auto.


PS: I have an H3DII and an SDI DVD player, so I'm familiar with its deinterlacing capabilities, and yes, DCDI is a liability when it comes to film.


----------



## JeffY

Vern,


How does Monsters Inc look in software mode with the Nvidia player? Does the problem still show up? I'm not a DXVA fan as such; if they can fix it in the software decoders, there wouldn't be any need for DXVA.


----------



## mp20748

OK, as mentioned earlier, I've had a couple of 5700 Ultras sent to me to evaluate. One card will go into a second HTPC that I'm having built. The other card will be modded and sent to Vern Dias. This testing will involve both Vern and myself.


My focus will be more on the performance of the video section of the card. I will look at the card's noise level, bandwidth handling, the cleanness of the video signal, and basically how well it deinterlaces. I'll be using Avia Pro and a few other DVD software programs. I'll even try to post photos of scoped measurements.


Vern's testing will cover the performance of the card with the Nvidia software player, and other features as they relate to the card's performance under various HTPC applications.


We'll both be using the Nvidia DVD player, and we'll both have identical 5700 Ultras with MP-1s.


I will get a modded 5700 Ultra off to Vern in the first part of next week. We'll need a few days to put this process in action, so I'm expecting the first test posting to be somewhere around the end of next week. My goal would be next Saturday.


My part of the testing may or may not start by next weekend, as I'm still not situated in my shop at this time; therefore I'll be somewhat busy moving about for a better working environment. So you may have to be patient with my results. But be assured that when I do post, it will be of substance...


----------



## VideoGrabber

JeffY commented:

> _No external scalers offer this functionality, its impossible to force film mode._ _External scalers can't be set to force film mode, only video mode or auto._


----------



## JeffY

May I suggest that film mode in TheaterTek, which is a forced weave, is not the same as "film mode" on an external scaler, which can mean several different things. Forcing video for film-based DVDs is quite frankly laughable; you are simply losing too much resolution by doing so.


----------



## RoBro

Forcing video on scalers is not like forcing BOB.

If you force video, the deinterlacer searches for areas that have no motion between the fields and weaves them. If it detects motion, it either bobs that area, interpolates, or shifts the area a bit and then weaves, depending on what chipset you use. Philips chips do a nice job of motion detection and compensation. They seem to be the only ones to get moving text on interlaced material almost perfect in the progressive output; the only problem seems to be when there is motion in different directions in adjacent areas.

If there is film material they just weave, but if the field flags are bad, they can compensate for this in video mode...
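The motion-adaptive behaviour described above can be sketched in a few lines. This toy version uses the inter-field difference as a crude motion proxy; real silicon (like the Philips parts mentioned) is far more sophisticated, and all names and thresholds here are illustrative:

```python
# Toy per-pixel motion-adaptive deinterlacer: weave where the two fields
# agree, interpolate vertically (bob) where they differ. The inter-field
# difference is a crude motion proxy; real chips do much better.

def deinterlace_adaptive(top, bottom, threshold=8):
    """top/bottom: lists of scanlines (lists of ints) for the two fields.
    Returns a progressive frame with twice the field height."""
    frame = []
    for y, (t_line, b_line) in enumerate(zip(top, bottom)):
        frame.append(list(t_line))               # top-field line kept as-is
        out = []
        for x, b in enumerate(b_line):
            t = t_line[x]
            t_next = top[y + 1][x] if y + 1 < len(top) else t
            if abs(b - t) <= threshold:          # static area: weave
                out.append(b)
            else:                                # motion: interpolate from top field
                out.append((t + t_next) // 2)
        frame.append(out)
    return frame

static = deinterlace_adaptive([[100]*4]*2, [[100]*4]*2)
print(static)   # full-resolution weave: four lines of 100s
```

Static areas keep full vertical resolution (weave); moving areas fall back to half resolution (bob), which is exactly the trade-off that makes forcing video mode on film material so costly.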

Roland


----------



## JeffY

I have an H3DII, so I know what to expect from both film mode and video mode with an external scaler. By the way, video mode on TT doesn't force bob either; it forces adaptive deinterlacing (if supported by the video card).


----------



## Daphoid

I'll stick with my 256MB Radeon, thanks.


My Dual Xeon HTPC loves it.


- D


----------



## Vern Dias

Quote:

_How does Monsters Inc look in software mode with the NVidia player?_
Excellent, no issues in software mode.


Vern


----------



## JeffY

Cool


----------



## Mikedd

Where can I get the Nvidia player? I just put my Asus card back in for another round. Asus gives you PowerDVD in the bundle. Can the Nvidia player be purchased somewhere? Is there an evaluation copy?


Thanks


----------



## JeffY

Hi Bob,


I'm looking forward to putting FWMM through its paces. R1 film is dead easy to detect, not much of a challenge. PAL is another story altogether; it will be interesting to see what FWMM makes of it.


----------



## JeffY

I've been testing the 5700 Ultra some more, and I can't get it to do a smooth 50Hz, not even with my H3DII card, which should be easy. Anyone else tried 50Hz PAL?


----------



## mp20748

OK, my HTPC with the 5700 Ultra should be finished this week. And I'm hoping to perform some testing this weekend.


If all goes well, we'll be comparing my HTPC with the 5700 to a well-tweaked HTPC with an ATI 9800Pro. The display device will be a Sony G90.


Besides a few DVDs, we'll be using only two different test pattern packages: DisplayMate and Avia Pro.


DisplayMate for everything before the DVD software player; Avia Pro for everything else.


With the latest test pattern features in Avia Pro, we'll be able to look closely at how well a device handles motion. How this relates to deinterlacing terminology I'm not sure, as I'm not familiar with the lingo; I only focus on whether it works. I gave up on keeping up with the lingo when, at a gathering, I watched a person praise how well a device handled motion without noticing how terrible the quality of the image was. I was amazed by this, so I've since focused on how well a device handles motion while also showing a clean image with very good sharpness and detail. In my opinion the two should go together.


The ability to view both of these at one time was, to my knowledge, not available in any test pattern software before Avia Pro. Now seeing just how much a motion-correcting feature is affecting the quality of the image is an easy process, mainly because any form of motion correction can and will affect the final detail and sharpness of the image.


The issue shouldn't be whether something has motion correction that works well; it should be how well it works without affecting the quality of the displayed image.


Just my twisted perspective.


----------



## HK-Steve

Mike,


Is the 9800Pro going to be a bog-standard card or an MP-1 modded card?


What are the other specs of your HTPC: CPU, RAM, DVD drive and motherboard?


Look forward to hearing the results.


Cheers

Steve


----------



## tixin

ttt.


Anxiously awaiting MP's test results on the FX5700.


----------



## THECLOSER

Steve,

the 9800 Pro that Mike P. will be testing is an MP-1 modded card. I am also waiting to hear the results, because I have seen the setup he is going to be comparing against, and it's awesome.


----------



## mp20748

My HTPC is still being built, so it was not ready for the scheduled testing.


I hope to have this done and completed before the end of next week.


Again, my HTPC will have a 5700 Ultra/MP-1, and the HTPC at the testing site has an ATI 9800Pro/MP-1 - The display is a well tweaked G90.


The testing will be performed by AVIA Pro, assisted by myself and Deniz.


----------



## ChrisWiggles

Thanks Mike, and stay well, I think we're all very interested in the results. All my support!


----------



## mbrandt

Just wanted to bump this thread up. I'm sure I'm not the only one who's interested in finding out how the 5700 Ultra/MP-1 fares against the 9800Pro/MP-1... Thanks for all your work on this.


- Mark


----------



## mikecazzx

Quote:

_Originally posted by Nich_
*I don't even have Monster's inc....


I haven't seen any errors at all. Could you point out some other disks?


I will try later none the less. We will watch a couple of flicks the next

couple of days. But I doubt that I'll find anything at all.


I'm using a Marquee 8500 Ultra.


Nich*
Does the 5700 card work with Theatertek?

If so, what needs to be changed?

What card is Mike going to be testing? What exact brand?

I have Radeon 7200 64 DDR with TT now and no stuttering issues.


Should I go Radeon 9600 or this 5700 Nvidia?


Power DVD and Win DVD are not an option - they look horrible compared to Theatertek.


----------



## stylinlp

I am asking the same question as Mike. I have a Radeon 9200SE and want a better picture by buying another video card, unmodified for now.


There was another thread showing how much better the Radeon 9500 and up looked compared to earlier models, so I've decided to upgrade.


----------



## PJ Adams

Same unmodified video card question from me as well...


Radeon non-pro fan-less 9600 or NVidia FX 5700 Non-Ultra?


----------



## mbrandt

From what I understand, Mike may only be testing out the modded cards this weekend. I think it was TheaterTek that actually sent Mike the 5700; I may have read that on another board. Sorry, I can't recall or I'd link it.


Mike - I know you're a busy guy. You do so much to bring this hobby (more like a passion, actually) to a higher level. Thanks for all your hard work. We're all indebted to you. Let us know how well your new toys play when you're done with them this weekend. Oh, and please correct me if anything I wrote above is incorrect.


- Mark


----------



## mp20748

Mark,

Yes, and I'm testing a card for someone at Nvidia as well.


We're hoping to do this this weekend, or very, very soon after that.


Sorry it's taking me so long. I'm also very anxious to see what this card is all about.


----------



## Nich

I'll be looking forward to hearing what you think.



Nicholas


----------



## mp20748

OK, I just found out that the HTPC is ready. And that it'll have some Hi-def media on the hard drive.


"Fifth Element, Lord of the Rings, Gladiator, Jurassic Park III, The Transporter, The Lion King, The Fast and the Furious, and some other clips. Some are in Windows Media HD and some are the actual HD transport streams. The nice thing is that the ForceWare Multimedia Player can actually play all of these, so you don't need a separate application for different media types."


It's Showtime!


----------



## mikecazzx

Quote:

_Originally posted by mp20748_
*OK, I just found out that the HTPC is ready. And that it'll have some Hi-def media on the hard drive.


"Fifth Element, Lord of the Rings, Gladiator, Jurassic Park III, The Transporter, The Lion King, The Fast and the Furious, and some other clips. Some are in Windows Media HD and some are the actual HD transport streams. The nice thing is that the ForceWare Multimedia Player can actually play all of these, so you don't need a seperate application for different media types."


It's Showtime!*
Nice, the sooner the better.


So if I buy a Radeon 9600 XT or Pro today and then have to buy a 5700 next week and take the Radeon back... how much CRT setup needs to be redone? Everything? Astig, focus, convergence... or just convergence?


----------



## genmax

Mike, assuming you did your astig at the highest frequency you will use, it shouldn't matter.


Just write down your powerstrip settings in case all is lost during the driver switch.


----------



## mp20748

It showed up today (thanks Doug), and it's ready!


Well, almost. I had to install the 5700 Ultra with the MP-1. Once that was done we spent about 30 minutes looking at hi-def movie clips, and Avia Pro.


I plan to spend more time on it tomorrow (Monday).


We also looked at a couple dozen Avia Pro test patterns on the scope. From what I've seen so far, the DAC in the 5700 is very impressive. The picture was so punchy that I had to check several times to make sure we were not going above 700mV. The MP-1 was in unity mode (non-boost).


I'll pick up on this next week, and we'll finish up the testing on the G90. I don't want to say much for now, but will report back later with more on this really nice card.


So far I'm impressed, very impressed, but I must maintain my posture, because the real testing has not yet started.


----------



## stylinlp

Mike, could I ask a question related to your experience and this subject?

I noticed you mentioned unity mode (non-boost). With my Extron 120p I have three settings: Unity, 50% and 100% peak level adjustment. I noticed the picture was a LOT sharper at 100%, and at 50% it was just a little sharper than Unity. But looking the images over, I think 100% might be too harsh and artificial, not film-like. Maybe. At 50% I get a slightly sharper picture but don't lose the film-like look. Do you have a viewpoint on this? Radeon 9200SE with a 15' Canare 5-BNC to 5-BNC cable.


Second question, about the new 5700 Ultra: which brand and model are you testing? Do these benefits carry over when used with TheaterTek and ffdshow?


Thanks


----------



## mikecazzx

Quote:

_Originally posted by mp20748_
*It showed up today (thanks Doug), and it's ready!

So far I'm impressed, very impressed, but I must maintain my posture, because the real testing has not yet started.*
Is the MP1 mod still available for the Radeon 9800's?

Can it be installed by the end user?


----------



## j lyon2564

Mike ,

I have a 9000 radeon with the mp1 mod.I have been waiting for the video cards to be worth a noticable upgrade.I had an older gforce card with the filters removed and it did the job for quite some time.If your testing shows it is time and worth an upgrade what will it cost to move the mp 1 from my 9000 to the 5700 ultra?will you offer the mod to the 5900 ultras also.


john lyon


----------



## dokworm

I just hope the boards we can buy end up as good as the reference boards that Mike is testing...

Don't forget to test 50Hz mode too, Mike, if you get a chance; a lot of us PAL users live by it, and there were some reports here of trouble with the 5700 @ 50Hz.


----------



## Nima

And 47.952 Hz (2 times 23.976 fps). I use that.


@dokworm:

Why should there be problems?


----------



## dokworm

I don't know why, but there are a few reports here of the 5700 having serious problems with DVD playback @ 50Hz


----------



## stylinlp

Jay over at the TheaterTek forum reported this post:


There is a new NVIDIA driver 56.64:


Release Highlights:


-Adds support for GeForce 5700 and GeForce 5700 Ultra

-Microsoft® DirectX® 9 and OpenGL® 1.5 support

-Supports application profiles for custom image quality and performance modes for all of your applications and games.

-NVIDIA nView 3.5 Multi-display technology

-Advanced Microsoft® Internet Explorer® popup blocker

-Industry's only Display Gridlines technology

-Improved HDTV Y Pr Pb component out support for 480i, 480p, 720p, and 1080i formats**

-For a complete list of compatibility fixes please consult the v56.64 Release Notes below

http://www.nvidia.com/object/winxp_2k_56.64


----------



## mp20748

Quote:

_Originally posted by stylinlp_
*Mike could I ask a question reated to your experience and subject? Good 

I noticed you mentioned Unity mode (non boost). WIth my Extron 120p I have 3 settings. Unity, 50% and 100% Peak level adjustment. I noticed at 100% it was ALOT sharpre picture and at 50% it was just a little sharper than Unity. But I started looking the images over and I think the 100% might be too harsh and artificial looking, non film like. Maybe. At 50% i get a little sharper picture but don't loose the filmlike look. Have a view point on this? Radeon9200se with 15' Canaire 5bnc to 5bnc cable.


2nd question: ABout the new 5700 Ultra. Which brand and model are you testing? Do these benifits there when used with Theater Tek and Ffdshow?


Thanks *
Unity gain usually means that the circuit in question does not affect the gain level of the signal. On the MP-1 there's a jumper that allows for a subtle boost of the output level. When the jumper is removed, the mod is in non-boost mode (700mV into the mod, 700mV on the output).


The boost switch (cable compensation) on the Extron was meant for very long cable lengths (far beyond what we use in HT). With very long runs of cable, the signal gets attenuated, which also causes high-frequency roll-off; or, better put, loss of sharpness and detail. The "peaking" control on the Extron was used to boost back the high frequencies lost to cable attenuation, restoring sharpness and detail to the image.


This was fine for computer graphic display, but for hi-end video, it's nothing but a distortion control. And there's absolutely no way to use one of these controls (in video) without including some level of distortion ("harsh").


In hi-end HT, the goal is to use very good grade cables. And because of the short cable distances we use in HT, there's no need for a noise-inducing circuit like that if the cables are of good quality. The key is good cables supported by good bandwidth; that equals a good image.
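A rough back-of-envelope for why long runs soften the picture: coax loss rises with frequency, so fine detail fades before flat fields do. The loss coefficient below is a made-up but plausible number used purely for illustration; consult a real cable datasheet for actual figures:

```python
from math import sqrt

# Toy skin-effect model: coax attenuation grows roughly with the square
# root of frequency and linearly with length, so long runs roll off the
# high frequencies (fine detail) first. The coefficient k is a made-up
# but plausible value in dB per metre per sqrt(MHz).

def loss_db(length_m, freq_mhz, k=0.06):
    """Approximate attenuation in dB for a given run and frequency."""
    return k * length_m * sqrt(freq_mhz)

# Low frequencies (flat fields) barely suffer; the fine-detail end does.
# That difference is what a peaking control boosts back, noise and all.
for m in (2, 10, 30):
    print(f"{m:>2} m: {loss_db(m, 1):.2f} dB at 1 MHz, {loss_db(m, 100):.2f} dB at 100 MHz")
```

The point of the model is the shape, not the numbers: loss doubles with cable length and is far worse at the frequencies that carry sharpness, which is why short runs of good cable beat peaking circuits.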


The brand 5700 Ultra is an ASYLUM. I also have another brand here, that only has Nvidia on it.


John,

the 5700 is very complicated to mod. It's not as easy as the ATIs; much more has to be done. For now I'm thinking it might be best to use the MP-1.3 (external to the card) with the 5700. This might change once I spend more time on this...


Speaking of time: it may be another couple of days before I'll be able to get back on this. I've been very busy with so many legal/medical/relocating issues that I'll have to spend the next couple of days getting on top of my backlog. I have so much work that I had to stop taking in more (until the first week in April).


We hope to pick back up on this by this weekend. I'm really looking forward to it.


Keep in mind, that I'm only testing the video performance of the DAC. Vern Dias will pick up with other issues, such as refresh rates, etc.


----------



## mbrandt

Thanks again Mike. Best of luck...


- Mark


----------



## Samhain_777

Quote:

_Originally posted by mp20748_
*Unity gain usually means that the circuit in question does not effect the gain level of the signal. On the MP-1 there's a jumper that will allow for a subtle boost of the output level. When the jumper is removed, the mod is in the non boost mode (700mv into the mod - 700mv on the output).


The boost switch (cable compensation) on the Extron was used for very long cable lengths (far beyond what we use in HT). With very long runs of cable, the signal would get attenuated, also causing high frequency roll-off. Or better put, loss of sharpness and detail. The "peaking" control on the Extron was used to increase or boost the high frequency that was lost from cable attenuation. This boost would restore sharpness and detail to the image.


This was fine for computer graphic display, but for hi-end video, it's nothing but a distortion control. And there's absolutely no way to use one of these controls (in video) without including some level of distortion ("harsh").


In hi-end HT, the goal is to use very good grade cables. And because of the short cable distances that we use in HT, there's no need to use a noise inducing circuit as such, if the cables are of good quality. The key is good cables supported by good bandwidth = equals good image


The brand 5700 Ultra is an ASYLUM. I also have another brand here, that only has Nvidia on it.


John,

the 5700 is very complicated to mod. It's not as easy as the ATI's. Much more has to be done. For now I'm thinking it might be best to you the MP-1.3 (external to the card) with the 5700. This might change once I spend more time on this...


Speaking of time. It may be another couple of days before I'll be able to get back on this. I've been very busy with so many legal/medical/relocating issues, that I'll have to spend the next couple of days getting on top of my backlog. I have so much work, that I had to stop taking in more (until first week in April).


We hope to pick back up on this by this weekend. I'm really looking forward to it.


Keep in mind, that I'm only testing the video performance of the DAC. Vern Dias will pick up with other issues, such as refresh rates, etc.*
Hi,


I am planning my first HTPC to hook up to my ceiling-mounted XV-Z9000 1280x720 DLP projector. I have a few questions I am hoping someone in the know can answer for me first:


I understand the MP-1 mod boosts signal strength for long runs (though from the above post I understand this is at the cost of image quality?).


I plan to run a 10m VGA cable through the ceiling to the projector, and am wondering if this will require an MP-1 mod to maintain signal strength.


I am not the sort to skimp badly on cables, though by the same token I don't buy Monster cables either, as I feel the price/performance at that echelon is poor...


Obviously, if I can get away with a $100 AUD card with no MP-1 and maintain high image quality, that's what I want to be doing versus hundreds more for an MP-1 modded card (can I buy this direct from the States, by the way?).


So yeah, how long is a "long run" exactly, and when do I need to be thinking about an MP-1? In fact, perhaps the MP-1 mod is more suited to CRT display devices?


Thanks for your help all.


----------



## WanMan

I think the MP-1 _has_ the ability to boost signal (line-level voltage) if jumpered to do so.


----------



## JBJR

The MP-1 mod is a RGBHV output mod only. Mike does not like VGA connectors.


John


----------



## WanMan

John, isn't VGA RGBHV on a PC? I think you are confusing physical interfaces with video signaling. The MP-1 uses five BNC connectors instead of a DB-15, and the video signal is RGBHV all the way.


----------



## pcgeek

Quote:

_Originally posted by Samhain_777_
*Hi,


I am planning my first HTPC to hook up to my cieling mounted XVZ9000 1280x720 DLP projector. I have a few Q's I am hoping someone in the know can answer for me first through:


I understand the MP-1 mod boosts signal strength for long runs (though from the above post I understand this is at the cost of image quality?).


I plan to run a 10m VGA cable through the cieling to the projector, and am wandering if this will require an MP-1 mod to maintian signal strength.


I am not the sort ot skimp badly on cables, though by the same token, don;t buy monster cables either as I feel the price performance of that escelon to be poor....


Obviously if I can get away with a $100 AUD card with no MP-1 and maintain high image quality, that's what I want to be doing versus hundreds more for an MP-1 modeed card (can I buy this direct from the states btw??)


So yeah, long run is how long exactly? and when do I need to be thinking about an MP-1 - in fact, perhaps the MP-1 mod is more suited to CRT display devices?


Thanks for your help all.*
This is probably the wrong place to ask about cabling for a DLP projector ;-) But to be a little more useful, why would you run RGBHV to a DLP projector? If you've got a HTPC, you'll be a LOT better off going DVI at the native resolution (in which case the MP-1 mod will do nothing for you). I'm not sure about cable quality for DVI but I assume it still matters to some degree.


----------



## mikecazzx

Quote:

_Originally posted by pcgeek_
*This is probably the wrong place to ask about cabling for a DLP projector ;-) But to be a little more useful, why would you run RGBHV to a DLP projector? If you've got a HTPC, you'll be a LOT better off going DVI at the native resolution (in which case the MP-1 mod will do nothing for you). I'm not sure about cable quality for DVI but I assume it still matters to some degree.*
Here is what I have gathered.


I think the MP1 was for RGBHV.


Turning VGA-to-RGBHV into RGBHV-to-RGBHV, and providing a better power source and true 75 ohms for improved color and less noise.


Also, it would improve cable runs over 20'-30' long.


Some claim it's a godsend; others claim it's a minor tweak.


----------



## ChrisWiggles

John is correct: the MP1 uses BNC RGBHV connectors, instead of a VGA plug, for higher quality. The signal, as you note, is still the same type, but the connection type is arguably superior, and there is no need for a VGA-RGBHV(BNC) breakout cable.


----------



## WanMan

I went back and re-read John's post. I interpreted it incorrectly the first time I read it. He wasn't implying VGA & RGBHV as being different, but rather that the mod is only for the RGBHV signal's physical interface.


----------



## JBJR

WanMan, correct, Chris is also correct about the mod. Once Mike puts the mod on the video card the VGA output on the card is disabled and only the DVI output on the card works.


John


----------



## mbrandt

I think Mike best describes the benefits of the MP-1 in this thread...

http://www.avsforum.com/avs-vb/showt...ght=buffer+mp1 


In summation...

- The VGA pathway is bypassed to RGBHV, giving you the opportunity to use 75 ohm connections on both ends

- The new RGBHV pathway has a buffer circuit installed to take the increased capacitance of longer cable runs (15'-30') into account, giving you more detail

- The DVI pathway is left as is


Mark


----------



## jcmccorm

Mark, and add to that nice summation that noise is reduced with the MP-1.


Cary


----------



## Samhain_777

OK, thanks all for the further info, and especially for the PMs... as I am new to the forum this is all good stuff for me.


OK... DVI: I would love to use DVI, but my Z9000 doesn't support it. The best I can do is RGB or component.


So I need to go direct from the HTPC to RGB or component. RGB obviously would be easier, since the PC outputs this natively (I understand RGB is the better of the two by a small? margin).


MP-1 mod: OK, sold... I am sure I need one of these, and was sure that a stock card wouldn't have made the 10m run (I would have been completely amazed if it did).


I plan to get an MP-1 modded Radeon (though are the modded Nvidia-based cards now thought to be better? Is this right?) and run it up to the DLP projector, OR install the PC in the ceiling and not use an MP-1, but that makes it hard to muck about with if I want to (it will be completely RF controlled, by the way).


I wonder how much difference there would be between a good quality 2m run of VGA cable (if the PC is in the ceiling) vs a 10m run from an MP-1 modded card (if the PC is in the entertainment rack)?


Where can I pick up an MP-1 modded card in AUS or from the US? I don't mind buying from the States, as long as it comes out cheaper than the high prices being asked here in AUS.


OK, here is some added complexity, since I have some of your much appreciated attention.


Part of this HTPC project is to move ALL HT equipment OUT of my lounge and into the room behind the lounge wall (remote controlled via my Pronto Pro and RF extender). This includes the HTPC.


I plan to run an external DVD writer into the lounge (through the wall) for loading DVDs - that being the ONLY component remaining in the lounge.


Now, with IDE run length issues aside, I understand it is possible to connect a DVD-ROM drive directly to an X-Card (or something like that) so that the H3D2 card can do its magic?


If not, then I won't be able to utilise the H3D2 card for DVD, as I specifically want to use only a small ROM drive in the lounge, mounted behind my screen on the wall.


If this is true, can I still burn DVDs with this same drive? If not, I will just use a ROM drive and burn DVDs on another machine over the network...


If this is possible, will it permit 576i/p signals to be processed, or just 480i/p?


I think I might have confused things there, though finding this info over the last few months has proven somewhat difficult, since there seem to be a lot of differing opinions, and also a certain amount of "assumed knowledge" in a lot of cases, which I do not have :-(


Thanks for your help guys, it's much appreciated!


----------



## dokworm

PM me if you want an MP-1 in Australia.

I finally bit the bullet and bought one from a user here, and it didn't make a difference I could see on my short cable run (approx 40cm or 15 inches) vs the Radeon 9200 I had with the filters removed, so I am happy to part with it to a fellow Aussie.

It may serve you well with your longer cable run. Although if you use very high quality cables, 10m shouldn't be too drastic. I'm hoping to sell it and try out a 5700.


If anyone else in Oz wants it - drop me a PM as well.


On your other issues, a SATA DVD burner will give you a longer cable run than IDE, or a SCSI burner will also extend your range a bit. If you are going to hide your PC away, why not put it next to the projector and bypass the long RGB cable issues?


----------



## dokworm

MP-1 has sold.


----------



## Nich

The new Nvidia drivers are out now. Let's see how they work for the

5700.


Nicholas


----------



## maneuen

Nich, what player are you using? Wasn't there a problem with the 5700 and TheaterTek originally? Anyone know (or want to find out for others with TheaterTek) if the new drivers helped sort that out?


----------



## Nich

I use powerDVD 5.0.


Nicholas


----------



## mp20748

I've been very busy, but was able to look at the 5700 on a G90 over the weekend.


This was supposed to have been the initial testing of the 5700. However, the testing did not go well, because the HTPC for some reason was not working properly with the 5700. It was not displaying what Doug and I were seeing when we first put the card in. I can't say for sure what the problem was, but I think either the card was not properly seated or the drivers needed to be reloaded.


Anyway, I plan to redo the testing, and by then the HTPC will be fully functioning (as it was when we first installed the card). So far I still like the card, mainly because it displays the cleanest signal I've experienced with any card. I say that based on using DisplayMate test patterns (not a DVD software player); the best test for a DAC is DisplayMate or some other non-DVD-player test pattern software.


The DAC in the 5700 is very powerful, clean, and the absolute sharpest... Therefore I'm pretty confident that this card may be a serious contender, but I'm also expecting that some driver upgrades and patches will have to be installed before the card reaches maturity.


So my testing will resume when all problems are solved, and when I'm confident that the best drivers are installed to complement the performance of the DAC.


----------



## pasey25

sounds good mike.


we all appreciate your vigour.


I'm waiting with cash in hand for the verdict.


----------



## Chuchuf

OK just had an opportunity to install an Asylum 5700 Ultra in my HTPC and here are my initial observations.

* The desktop is noticeably sharper than with the ATI card. It has cleaner edges and deeper-looking colors.

* I only did a quick calibration by eye, using The Fifth Element to set overlays. The picture looks very good as well. I am getting some stuttering and micro-stuttering every once in a while that I wasn't getting before. I have not run test patterns yet. Someone suggested the power source could be the cause of the stuttering; will check that.



Terry


----------



## Vern Dias

Terry, if you are using PowerStrip, you will need to redo the timings to achieve 71.928 or whatever the correct multiple of the frame rate is for your setup.


The clock timings on the Nvidia are just slightly different from those on the ATI.
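For anyone wondering where 71.928 comes from: NTSC film runs at 24000/1001 ≈ 23.976 fps, and a judder-free refresh is an exact integer multiple of that, which a quick sketch makes obvious:

```python
# NTSC film runs at 24000/1001 fps; a stutter-free refresh is an exact
# integer multiple of it. 72.000 is close but not exact, hence the
# ~0.1% drift people chase with PowerStrip.

FILM_NTSC = 24000 / 1001      # 23.976... fps
FILM_PAL = 25.0

def refresh_multiples(fps, max_hz=100):
    """Integer multiples of the frame rate up to max_hz, rounded to mHz."""
    return [round(n * fps, 3) for n in range(1, int(max_hz / fps) + 1)]

print(refresh_multiples(FILM_NTSC))   # [23.976, 47.952, 71.928, 95.904]
print(refresh_multiples(FILM_PAL))    # [25.0, 50.0, 75.0, 100.0]
```

At a flat 72Hz, every frame is displayed 3 times but the clock gains one frame every ~41 seconds, which surfaces as the periodic stutter discussed later in the thread.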


Vern


----------



## jcmccorm

Vern, did you mean 71.928 or was 71.910 on purpose? Thanks.


Cary


----------



## Vern Dias

Sorry, 71.928 is correct. Not sure what I was thinking... Fixed the post.


Vern


----------



## Chuchuf

Vern,


I suspected that could be the problem as well. I was using the Nvidia resolutions. The version of PowerStrip I have won't accept this card, so I have to get the newer version.

I am going to test with a customers HTPC I just built on my system in the next day or so. Very nice so far.


Terry


----------



## mikecazzx

Quote:

_Originally posted by Chuchuf_
*OK just had an opportunity to install an Asylum 5700 Ultra in my HTPC and here are my initial observations.

* The desktop is noticably sharper than the ATI card. It has cleaner edges and deeper looking colors.

* I only just did a quicky calibration by eye using The Fifth Element to set Overlays. Picture looks very good as well. I am getting some shuttering and micro shuttering every once in a while that I wasn't getting before. Have not run test patterns. Someone suggested that power source could be the cause of the shuttering. Will check that.



Terry*
No chance a reconvergence has made it sharper... wouldn't you have to do one with a new card?


(I duck and run for cover)


----------



## mikecazzx

Quote:

_Originally posted by Vern Dias_
*Terry, if you are using PowerStrip, you will need to redo the timings to achieve 71.928 or whatever the correct multiple of the frame rate is for your setup.


The clock timing on the NVidia are just slightly different than the ATI.


Vern*
This is a good point...I used Pstrip and went 72 with my Radeon - do we actually need to get it down to an exact 71.928?


I clearly missed this thread before.


----------



## Vern Dias

Terry, as you have discovered, older versions of Powerstrip don't support the Nvidia 5xxx cards. When I got mine, I had to download the latest beta of Powerstrip, but now you can just request the online update to the current production version and you will be fine.


Mike, no need to duck and cover.


I have seen that running the exact 71.928 makes a difference in micro and macro stutters.


However, good luck getting to that exact refresh; it's voodoo and a lot of trial and error.


Vern


----------



## cpurvis

Actually, I have always found it easy to get 71.928 exactly - just incrementally reduce your sync width by one pixel at a time and try typing in 71.928 each time you do it. You should only have to adjust it a couple of times before it will accept that exact timing. The minute adjustments do not appear to the PJ as a new resolution so it doesn't really affect the picture.
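The trick above works because refresh = pixel clock / (horizontal total x vertical total), and the card's clock synthesizer only steps in discrete increments. Shaving a pixel off the sync width changes the totals just enough that one of the achievable clock steps lands on the target. A sketch of that arithmetic; the timing totals and the 10 kHz clock step below are invented for illustration, not PowerStrip or any real card's values:

```python
# Refresh rate follows directly from the pixel clock and timing totals
def refresh(pclk_hz: float, h_total: int, v_total: int) -> float:
    return pclk_hz / (h_total * v_total)

h_total, v_total = 1800, 838            # hypothetical totals incl. blanking
target = 3 * 24000 / 1001               # 71.928... Hz, exact 3x NTSC film

# Pixel clock needed for an exact hit:
pclk_needed = target * h_total * v_total
print(f"needed: {pclk_needed / 1e6:.4f} MHz")

# If the synthesizer only steps in 10 kHz increments (an assumption),
# the nearest achievable clock misses the target slightly -- hence the
# fiddling with sync width to find totals where a step lands closer.
step = 10_000
pclk_real = round(pclk_needed / step) * step
print(f"achievable refresh: {refresh(pclk_real, h_total, v_total):.4f} Hz")
```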


----------



## BangoO

Quote:

_Originally posted by Vern Dias_
*I have seen that running the exact 71.928 makes a difference in micro and macro stutters.*
Not if you use ReClock. I prefer setting the refresh rate at 72Hz and using ReClock, the result is far better on my PC.


----------



## Vern Dias

Sorry, but I do use ReClock, and running at 71.928 eliminates the need for ReClock to resample the audio to match a refresh rate that isn't an exact multiple of the NTSC film rate.


Vern


----------



## BangoO

Sure, but it's ReClock's job to resample the audio to 24 fps instead of 23.976 fps, and it does it perfectly, so why bother?


----------



## Vern Dias

But it doesn't do it perfectly all the time. At least I can sometimes hear the drops and repeats.


Vern


----------



## Mark_A_W

If you actually go a TINY little bit under 71.928, say 71.925, you will force ReClock to drop frames more often than it repeats - drops are far less audible.


Food for thought, especially if you can't coax the exact number out of powerstrip.


----------



## Chuchuf

Got it w/ Powerstrip today. So far looks very good. Will know more after tonight.


Terry


----------



## mikecazzx

Quote:

_Originally posted by Mark_A_W_
*If you actually go a TINY little bit under 71.928, say 71.925, you will force ReClock to drop frames more often than it repeats - drops are far less audible.


Food for thought, especially if you can't coax the exact number out of powerstrip.*
Not sure I'm seeing this micro stutter stuff - sounds like I don't want to know.

I am at 72


----------



## Mark_A_W

It's audio DD frame dropouts I was talking about, Mike.


If you're happy, don't change anything.


----------



## bblue

Has anyone compared this to a Matrox Parhelia?


I've been using the latter for several months now after having upgraded to it from a Radeon 9800 Pro. At that point in time there was an improvement in overall detail and certain ringing type artifacts were eliminated. I had removed the high frequency output filtering and converted it to BNC output.


Yesterday I installed the FX5700 Ultra in its place and viewed a number of test displays and a couple of 'old standby' DVD titles as a comparison. Funny thing is that it seems slightly clearer, but not sharper. Colors seem slightly richer, and low level black detail is slightly improved. This is without the removal of high frequency output filters, and without the conversion to BNC. Just a VGA->BNC cable driving my 30' run to the projector.


Seems like it would benefit some (in terms of sharpness) with removal of the HF filters and the addition of a low impedance wide bandwidth buffer amp like the MP-1 for driving the cable runs, but it's very good as-is. Does this gel with others' experiences?


It is amazingly close in performance to the Parhelia, however, for 2/3 the price.


--Bill


----------



## jcmccorm

Thanks for the info Bill. I picked up a 5700 recently but I've been reluctant to install it since TheaterTek apparently doesn't happily coexist with it from what I've read and I haven't explored other playback options yet.


Are you just using a VGA breakout cable with BNC couplers to your 30' run? I also have a 30' run of Belden 1694A to my projector and was hesitant to drive it with a breakout cable for fear of impedance discontinuities at the couplers.


Cary


----------



## bblue

Cary,

Yes, that will degrade things but I didn't want to go through all the mods without seeing a tantalizing reason to. Without a good reason I'd take it back and re-install the Parhelia. But it definitely shows promise. I just have to watch it some more to try and quantify just what it is I'm seeing.


It would be interesting to see how yours does with a similar disadvantage compared to what you are used to, and knowing that it would (presumably) work better with the right connections and driver amp.


Be sure to use driver 56.64 from the web site, not what's on the disc. Actually, there's nothing on the disc you need except possibly version 2.55 of their player.


I also found that it seems to support a lot more modes with the latest version of PS and they were easier to get set correctly. I can't imagine why TT would have a problem with it, but have heard that before. I'm using ZP Pro with the WinDVD5 decoders.


--Bill


----------



## mikecazzx

Quote:

_Originally posted by cpurvis_
*Actually, I have always found it easy to get 71.928 exactly - just incrementally reduce your sync width by one pixel at a time and try typing in 71.928 each time you do it. You should only have to adjust it a couple of times before it will accept that exact timing. The minute adjustments do not appear to the PJ as a new resolution so it doesn't really affect the picture.*
Mine seems to be working at an even 72. What is the issue I should be looking for?


----------



## Mark_A_W

Mike, you're making ReClock work harder and repeat DD audio frames by running a 72 Hz refresh rate instead of 71.928.


At 72 Hz ReClock will still work properly, so you won't get judder/dropped frames, but you will get more audio frame repeats - which may be audible.


Mark


----------



## cpurvis

Quote:

Mine seems to be working at an even 72. What is the issue I should be looking for?
Nothing - if it looks good to you don't mess with it.


----------



## bblue

Quote:

Actually, I have always found it easy to get 71.928 exactly - just incrementally reduce your sync width by one pixel at a time and try typing in 71.928 each time you do it. You should only have to adjust it a couple of times before it will accept that exact timing. The minute adjustments do not appear to the PJ as a new resolution so it doesn't really affect the picture.
I have *never* been able to get exactly 71.928. Maybe you're just using a screen size that lends itself to being close?


Mine is currently at 1440x810 and the sync parameter is at five. Reducing it one by one and re-entering as you describe gets numbers that are close, but never that one. Same test up and down with back and front porch as well. 71.935 is usually the closest I can get and will result in about 30 drops in a 2 hour movie, and I only notice 2 or 3 of those.


Any other technique you have used to accomplish this? If you can't get it exactly, wouldn't it be better to be low than high?


--Bill
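Bill's numbers are roughly what the arithmetic predicts. The refresh error accumulates as A/V drift, and each correction is one Dolby Digital frame (1536 samples at 48 kHz, i.e. 32 ms). A back-of-envelope sketch; the 2-hour runtime is an assumption, and real counts will vary with how the player and ReClock batch corrections:

```python
TARGET = 3 * 24000 / 1001        # 71.928... Hz, exact 3x NTSC film rate
AC3_FRAME_S = 1536 / 48000       # one Dolby Digital frame = 32 ms at 48 kHz

def corrections_per_movie(actual_hz: float, movie_s: float = 7200) -> float:
    """Estimated AC-3 frame drops/repeats needed to absorb the refresh error."""
    drift_s = movie_s * abs(actual_hz - TARGET) / TARGET   # accumulated A/V drift
    return drift_s / AC3_FRAME_S

# Bill's 71.935 Hz over a 2-hour film:
print(round(corrections_per_movie(71.935)))   # ~22, same ballpark as his ~30
```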


----------



## bairda

I just got a 5700 to replace my Radeon 9600. Everything looks OK, except that the picture inside both TheaterTek and NVDVD is zoomed in at any interlaced resolution. When I use a non-interlaced resolution, everything looks good. But something along the lines of 1920x1080i will give this zoomed-in picture on either of these DVD players.


Anyone know what is going on?


Thanks,

-Alex-


----------



## Pedro-in-Oz

Now that is weird - I would contact TheaterTek, they have been very helpful in the past. How would you rate the 5700 over your Radeon?


----------



## WD-40

To bring this thread back-


Does anyone know if any low profile (half height) cards are available, based on the 5700 core? I wouldn't need a second display or a TV out, just one VGA port.


- David


----------



## BangoO

There is still something that is not clear... I read that NVidia says the MPEG2 core is the same on the 5600/5700/5900, then I read that the 5700 doesn't have the same MPEG2 core as the others and is therefore way better... so are they all the same, or are they different?


----------



## WD-40

And, has anyone found out- Is it the output stage on the 5700's specifically that is good, or would a card such as the 5900 have the same output quality?


- David


----------



## JeffY

Quote:

_Originally posted by BangoO_
*There is still something that is not clear... I read that NVidia says that the MPEG2 core is the same on the 5600/5700/5900, then I read that the 5700 doesn't have the same MPEG2 core as the others, and therefore is way better... so... are they all the same or are they different ?*
Yes, I can definitely say the MPEG decoding on the 5700 is just as crappy as on the 5200. You just have to forget about DXVA.


----------



## Vern Dias

What I can't understand is why anyone today cares about DXVA, anyway.


As soon as you factor FFDSHOW or the FWMM postprocessor into the graph, you can't use DXVA anymore, so it's certainly a non-issue for me.


I would think it's also a non-issue for anyone seeking the highest quality image possible from an HTPC.


Vern


----------



## PJ Adams

Bango-O - I posted the same question with regards to the MPEG engine a few weeks ago in the HTPC forum.


There was a bit of a 'rumor' floating about that the 5700 had a superior MPEG engine, but it turned out to be not true.


The entire 5xxx series share the same MPEG engine BUT the 5200 when compared to the others does not have the processing speed to fully utilize the benefits of the MPEG engine. Therefore, minimum recommended card, (for what we want), is a 5600.


In theory, all the cards from the 5600 upwards give the same output (with regards to what we want, via the VGA port), but I don't read anything about the 5600 - everyone seems excited about the 5700.


I'm still undecided...but then again, I've not even seen one in action...


Paul


----------



## genmax

I noticed considerably LESS noise switching from a Radeon 9000 to an FX5700u. Theater Tek looked like crap but when I switched to ZoomPlayer with the Nvidia 3 decoders it really looked great.


----------



## VideoGrabber

PJ Adams wrote:

> The entire 5xxx series share the same MPEG engine BUT the 5200 when compared to the others does not have the processing speed to fully utilize the benefits of the MPEG engine. Therefore, minimum recommended card, (for what we want), is a 5600.


----------



## WD-40

Thanks guys, the discussion here has answered all of my questions.


- David


----------



## ChrisWiggles

It hasn't answered mine! Now I'm confused, maybe I missed something. Should I be looking at a radeon or an nvidia? Keep in mind I have 25 feet of belden RGB cable run, so do you think it's worth it to go for an MP modded card?


----------



## BangoO

Quote:

_Originally posted by PJ Adams_
*Bango-O - I posted the same question with regards to the MPEG engine a few weeks ago in the HTPC forum.


There was a bit of a 'rumor' floating about that the 5700 had a superior MPEG engine, but it turned out to be not true.*
Then it's funny, because people only started talking about NVidia overtaking ATI once the 5700 came out... The 5900 has been out for a long time and I never read that it was better than a Radeon!


----------



## BenY

It seems that the Nvidia test trials went nowhere and the testers are embarrassed to admit it...

Hang on to ATI...


anyone !?!?!


----------



## dokworm

Has anyone e-mailed theatertek re the problems?


----------



## CaspianM

Quote:

_Originally posted by ChrisWiggles_
*It hasn't answered mine! Now I'm confused, maybe I missed something. Should I be looking at a radeon or an nvidia? Keep in mind I have 25 feet of belden RGB cable run, so do you think it's worth it to go for an MP modded card?*
Right now I am using a Nvidia 5700 in place of my 9500 Pro with my XG.


I am not a technician and Mike's findings might be vastly different.

Running both the Radeon and the 5700 at the same time and switching back and forth between two computers, I like the 5700 better by a good margin. My screen size is 100" diagonal. First off, it has more natural colors than the Radeon. Second, the picture looks denser than the Radeon's. Overall it gives a better sense of depth, and the background noise seems better as well in low-contrast clips. Having said that, this card has different contrast/gamma and gray scale output. But once that is set up in the PJ it should be a better card, imo. My Radeon was not modified, so perhaps comparing modified cards brings different results; I simply don't know.


----------



## genmax

Did you notice less noise coming from the Nvidia? I saw similar qualities as you, but there was a lot more noise coming through the (my) Radeon.


----------



## Chivs

My understanding is that FX5200, 5600 and 5700 share the same MPEG core, however the FX5900 uses an older generation. Take this for what it's worth.


Regards,


----------



## Jerry Pease

Quote:

_Originally posted by BenY_
*It seems that the Nvidia test trials went nowhere and the testers are embarrassed to admit it...

Hang on to ATI...


anyone !?!?!*
it would be nice to know the scoop


----------



## BenY

Or better yet, they have found that Nvidia had installed RGB amps into the cards, so there's no need for a mod anymore... (-:


----------



## mp20748

Testing will resume when Deniz gets back from vacation.


But for now, and from what I've observed so far, I have a 5700MP-1 that's going into my new HTPC case. And I'll tighten the screw that holds the card in.


----------



## VideoGrabber

BenyY commented:

> It seems that the Nvidia test trials went nowhere and the testers are embarrassed to admit it...


----------



## Vern Dias

The reason I haven't posted any test results is simple: I don't yet have a card to test.


I have posted the results on my non-modded 5950 Ultra in the past. It pretty much agrees with what others have said here, better resolution, sharper desktop and a very pleasing rendition of DVD's.


Vern


----------



## Chuchuf

A week ago, Andrew (TheaterTek) and I did some testing on a BFG 5700 with the Sonic filters he uses. Tests on things like Moving Plate High produced some very good results.

I have been running it in my HTPC since without difficulties. It's a bit different to set up the overlays on, but once done the results are very good.

Desktop is definitely cleaner.

Like MP, I also like the overall design of the card and the choice of components. Can't wait to see one with his mods when he gets time to do them.


Terry


----------



## BangoO

Which 5700 is supposed to be the best for home theater use?

Asus, Abit, Chaintech or MSI ?

The Chaintech has an additional power cable, so maybe that makes a difference?


Thx


----------



## BangoO

Come on guys...


----------



## CaspianM

I don't think you can find the nvidia card Mike Parker has. I have the Asylum by BFG Tech, which seems like a good card. MP also has one of these, from what I read.


----------



## mp20748

Now we can resume the testing of the 5700, and we're planning on doing that this Tuesday.


Sorry for the delay on this, but in order to properly evaluate this card, I'll need to have certain things in order first. The main thing is to get the software DVD player to function at or near 100% (this is where Doug comes in). This is important because I'll have to get a 20 MHz signal from a DVD to display a square-wave pattern on the scope. And since the testing involves the MP-1, there will probably need to be a change made in the filtering network on the MP-1. Then the 5700/MP-1 will have to be tested with a 30' 75 ohm RG6 cable (Belden 1694A).


Once the software player is functioning properly and the scoped patterns meet my expectation, we'll arrange for the final testing on Deniz's G90.


BTW, Deniz (formerly of T A W) is probably the most critical eye that I know of other than myself. He has a tremendous ability to see and understand the most critical elements of video performance. Actually it's a real pleasure to have him present on this. So it's not just his G90 that we're utilizing.


----------



## ChrisWiggles

Thanks for the update Mike, I'm glad you are doing better now, and I'm excited to decide what kind of video card I will get for my new HTPC. Do you know what kind of timeframe your modded 5700s will be available?


I'll wait to see what kind of results you see during your testing!


----------



## Steve_T

Any update Mike


----------



## mp20748

Yes, we did find time to do an initial test of the 5700 with the latest drivers. So far we've not noticed any REAL difference with the latest drivers (concerning the issues we noticed earlier).


However, we were able to correct the problem we were having from the earlier testing. Now we're able to get good, stable performance.


Still, we have a few more things to do before we can really comment. But as I've indicated before, I'm sure it's going to be a 5700 for me.


----------



## blafarm

Slightly OT, but I've been told the eVGA version of the 5700 Ultra has a rather noisy fan, and I was wondering if anyone has had better experiences with any other brands. I'm installing in a pretty quiet SFF machine that lives in the theater, so noise is an issue.


Thanks for your suggestions.



Edit: And, while we're on this subject, as there seems to be some slight build variations between manufacturers - is there one brand worth aiming for?


Thanks


----------



## Jerry Pease

Quote:

_Originally posted by blafarm_
*Edit: And, while we're on this subject, as there seems to be some slight build variations between manufacturers - is there one brand worth aiming for?


Thanks*
echo


----------



## blafarm

Boy, this thread really fell off a cliff, didn't it.


----------



## mp20748

Quote:

_Originally posted by blafarm_
*Boy, this thread really fell off a cliff, didn't it.*
Yeah, but only for a moment.


I've suspended my test on the 5700, and will probably post back on its performance a little later. For now I've also taken on a 5950, so my actual final testing will involve the 5700 as well as the 5950.


So far, I find the 5700 (when modded with the MP-1) to have a very low noise floor. Therefore it blows the ATI 9800 away as far as basic DAC performance. In other words, the analog signal on the 5700 is extremely clean. In my opinion the Matrox Parhelia had the cleanest analog signal that I'd tested so far, but that was before the 5700 arrived. The 5700 has the cleanest analog signal that I've seen from any DAC...


Now concerning the 5950. I only got a brief moment to look at the 5950, mainly because I had inadvertently caused the card to fail, so it had to be sent back for replacement. From my initial visual observation of the 5950, I could see a very hefty and well built card. Very nice (a tad too noisy) dual fan heat sink assembly, but much needed for the processor. What really jumped out at me was the attention given to the mini switching power supplies, which are perfectly located on the PC board and are not near any analog section of the card. Truly the designers had high performance graphics in mind, mainly because any digital (or other) noise in a high resolution analog graphics signal would definitely degrade the best possible analog signal from the card. Finally, someone decided to not just randomly place the components on the card. Technically speaking, the layout is very well done. When integrating analog and digital signals on the same board, much consideration should be taken to avoid the trauma of a mixed signal environment. Of course a design like this would usually only be found on expensive PC board layouts, and would NOT likely be seen on a consumer video card... well, not until the 5950.


I can only look at noise and other stuff using my scope and a 17" monitor. I did not get a chance to take the test to my shop (card failure - my fault). The card was immediately called in for an RMA, and was shipped out the next day... yes, I feel like someone has put me on hold.


NVIDIA seems to have an attitude, because it shows in the 5700 and 5950. If they're not ahead of ATI - they will be very soon.


----------



## VideoGrabber

Thanks for the update, Mike.


> I can only look at noise and other stuff using my scope and a 17" monitor.


----------



## Jerry Pease

Quote:

_Originally posted by mp20748_
*NVIDIA seems to have an attitude, because it shows in the 5700 and 5950.*
can you recommend any brand/implementation????


----------



## mcpherv

Mike, the next generation of cards is basically out, and Nvidia is still behind in terms of performance. Their cards are also geared towards texture performance instead of shader performance right now, which is backwards-looking, unlike ATI, who are doing the opposite. Please, let's continue to be objective in all of this. There are a lot of politics in the video card world right now (visit www.rage3d.com or www.nvnews.net ) - let's stick to the 2D video performance, please.

Vic


----------



## pcgeek

I'd love to see any info on the 6800 cards as well, since they're supposed to have video processing capabilities (which I don't believe the ATIs picked up). If the quality holds from the 5700 to the 6800 on the signal side, then I think we've got the makings of an awesome platform (now we just need to get ffdshow offloading to the 6800 cards  )


----------



## dokworm

I think Mike was leaning towards NVidia because he believes they now have a better 'physical design philosophy' than ATI, rather than it being a features/performance comparison as such.

Tom's Hardware has a good in depth review of both cards and their underlying technology. www.tomshardware.com 

I repeat the call, though, for the people that run the avsforum to put a request through to Tomshardware that they review 2D quality and DVD playback performance/quality as well as the usual performance tests. Tom's have been responsive in the past when suggestions come from 'informed' groups...


----------



## mp20748

Quote:

_Originally posted by mcpherv_
*Mike, the next generation of cards is basically out, and Nvidia is still behind in terms of performance. Their cards are also geared towards texture performance instead of shader performance right now, which is backwards-looking, unlike ATI, who are doing the opposite. Please, let's continue to be objective in all of this. There are a lot of politics in the video card world right now (visit www.rage3d.com or www.nvnews.net ) - let's stick to the 2D video performance, please.

Vic*
If you notice from my post, my observations have been on the performance of the DAC's analog signal quality. In comparison to the analog signal quality of the ATI DACs, the Nvidia 5700/5950 far outperforms them (my opinion only). Concerning 2D/3D performance, I have no idea what either of them is doing; I've yet to really evaluate the card in action on DVD with the latest upgrades. So far I've only commented on the analog signal's performance.


I don't even get into loading the drivers, deinterlacing, etc. That will probably be done by someone else. As for the ATI's being better performers than the 5700/5950, I would disagree with that. And since I have no interest in what either of them are doing as gaming cards, that excludes me from the politics that you mention. As for HT use, the goal for us CRT users is the quality of the analog signal, we want the cleanest signal possible, and if the DAC's signal is not up to par, whatever else the card is good for would make the final signal also sub-par.


The card was offered to me to evaluate the analog section only. So far, the signal quality of the DAC's analog out is the best I've looked at--period.


----------



## mcpherv

Okay Mike, sorry I misunderstood on that one. I thought that might be what you were getting at, but it seemed like you had already covered that one generally in your post, and the last comment was on something else. People buy cards on your recommendation, so I just didn't want any confusion on that one. Thanks,

Vic


----------



## hdtv_lover

Mike Parker,

As one who is keenly interested in home theater (primarily) as well as 3D performance, I'd like to give an extra bit of info here. Nvidia makes the chipsets and puts out a basic design to its board manufacturers. The manufacturers don't have to use that design at all. Lots of low-ball price cards use very substandard parts, causing grief in all sorts of applications. So it is very important that you let us know which vendor's card you tested, as the DAC is only one piece of the puzzle -- a critical piece, but still only a piece.


I don't mean to imply that all low-ball cards are junk -- sometimes they are the best, but one *really* has to do their research not to get stung at any price. Yes, it's a LOT of trouble, but don't skimp on the $40-300 card when you've just paid multi-thousands for your pj.


HDTV Fan

The oil company needs some hangin's.


----------



## mp20748

HDTV Fan,

You're right, and for some reason I keep forgetting how these cards are manufactured - thanks.


I'm gonna put the cards on the table with this one... I'll get into the manufacturer later, and maybe then I'll be able to do a better job of conveying my experience with it. I'll say this for now: there will be a photo of it on my website shortly after I get it back into my hands.


----------



## vairulez

anything new ?


----------



## mp20748

Yes, I've got the 5950 back from the manufacturer. I'll pick back up on this next week after I get the mod attached to the card.


----------



## Budget Pete

Is the 5700 signal from the DAC so clean that it wouldn't require filtering?


----------



## mp20748

Quote:

_Originally posted by Budget Pete_
*Is the 5700 signal from the DAC so clean that it wouldn't require filtering?*
All DACs need filtering.


A DAC is a mixed signal (analog/digital) device. With high speed digital signals on the same platform as analog signals, you're going to get some noise from the digital side into the analog. This noise is usually above the video bandwidth, and is (in some cases) so high in frequency that it can even cause RF interference in radios, TVs, phones, etc. The purpose of the filtering is to strip, as much as possible, the frequencies above the usable video band. It's more commonly known as a "low pass filter".


All DACs should have this filtering. By removing the filters, you'll get the appearance of a sharper image, but that may not really be the case, because some DACs have a somewhat emphasized boost on the analog output. This slight boost is there to maintain the video signal's performance through the filtering process. In other words, the filters restore that high frequency boost back to the original signal's characteristics. Without the filters, you'll get a perceived sharper image, but it will more likely suffer from over-peaking masquerading as sharpness, and will more likely carry the higher frequency noise that you'd rather not see in the image. Most of us will focus on the sharpness and not notice the noise or the over-peaking effect.


The goal is to get a CLEANER signal, and this requires filtering. A cleaner signal is better for HDTV, mainly because with HDTV you want the cleanest signal possible, to be able to properly display subtle detail and definition. HDTV from a DAC without filtering can let the noise and peaking effect diminish the detail of the image. And the intent of modern video is to be able to see the image in full detail.


The MP-1 does have filtering in place.
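As a rough illustration of where that low-pass cutoff needs to sit: the useful video band tops out around half the pixel clock, and everything above that is DAC switching noise. All numbers below are invented for illustration (hypothetical timing totals and a hypothetical RC filter), not measurements of any card or of the MP-1:

```python
import math

def pixel_clock_hz(h_total: int, v_total: int, refresh_hz: float) -> float:
    return h_total * v_total * refresh_hz

def video_bandwidth_hz(pclk_hz: float) -> float:
    # Nyquist: detail alternating every other pixel is pclk/2
    return pclk_hz / 2

pclk = pixel_clock_hz(1800, 838, 71.928)   # hypothetical totals incl. blanking
bw = video_bandwidth_hz(pclk)
print(f"pixel clock ~{pclk / 1e6:.1f} MHz, video band ~{bw / 1e6:.1f} MHz")

# A first-order RC low pass with its -3 dB point just above the video band
# would pass the picture and shave off the switching noise above it:
def rc_cutoff_hz(r_ohms: float, c_farads: float) -> float:
    return 1 / (2 * math.pi * r_ohms * c_farads)

print(f"75 ohm + 33 pF cutoff: {rc_cutoff_hz(75, 33e-12) / 1e6:.1f} MHz")
```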


----------



## mp20748

Quote:

_Originally posted by mp20748_
*Yes, I've got the 5950 back from the manufacturer. I'll pick back up on this next week after I get the mod attached to the card.*
The mod was attached, and the 5950 was put in the HTPC. Thanks once more to Doug who loaded the latest drivers, and got the HTPC ready for Prime Time.


And thanks to Mark Haflich for the use of his home theater for the testing.


We looked at the HTPC with the 5950, but first we had the pleasure of watching Mark's processor, so it was a good starting point. We also hooked up the MP-5.


As expected, Mark's processor was a non-issue. The 5950 was simply excellent, and that was without it being scoped. We forgot to scope it before we left to go to Mark's. We did get to scope the card once we got back to my house. We found that the most important tweak for this card is proper levels set by a scope; once that's done, it puts the card in a class of its own.


We also got to look at Mark's HD Sat box before the MP-5, which outputs RGBHV. Then we watched U571 from a JVC Deck to the MP-5. By comparison the sat box was anemic in black levels.


The 5950 is the best video card that I've seen so far, but in order to see it at its best, IT MUST BE CALIBRATED. Finally, a video card that renders PERFECT blacks and is so clean, it looked better than film.


Ten Thumbs Up!


----------



## jcmccorm

Thanks Mike!


For those of us with a scope, and a 5700Ultra card, how do you adjust the individual levels for RGB? Do you just change the 75ohm termination resistors on the card to something just above or below 75ohm? Or are you making the adjustments on the MP-1?


I've been thinking of moving the MP-1 from my ATI9500 to the NVIDIA 5700 and giving it a whirl. It would indeed be a bummer if the 5700 outputs are off (my gray-scaled Marquee would be off and I wouldn't match my HD source any more) 


Cary


----------



## Chuchuf

I have been running the 5700 card for about 6 weeks now and since the latest release of drivers from NVidia a few weeks ago, this card has really cleaned up on the video side. No more stuttering using TheaterTek.


Terry


----------



## mp20748

Quote:

_Originally posted by jcmccorm_
*Thanks Mike!


For those of us with a scope, and a 5700Ultra card, how do you adjust the individual levels for RGB? Do you just change the 75ohm termination resistors on the card to something just above or below 75ohm? Or are you making the adjustments on the MP-1?


Cary*
I don't know of a way to adjust the levels within the card. But the three levels from the ATIs and the Nvidias are slightly different. I would expect to compensate for this on the projector. I've not changed the resistors on the board, though they are not all (RGB) 75 ohms in DC resistance. Why? I have no idea, but for some reason they are not equal in resistance.
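For context on why those termination resistors set the level at all: a video DAC sources a current, and the output voltage is that current flowing into the card-side resistor in parallel with the display's 75-ohm input. A toy calculation; the 18.67 mA full-scale figure is a common convention for VGA-class DACs, not a measured value from this card:

```python
def parallel(r1: float, r2: float) -> float:
    return r1 * r2 / (r1 + r2)

def output_level_v(i_fullscale_a: float, r_card: float, r_load: float = 75.0) -> float:
    """Peak video level: full-scale DAC current into the doubly terminated load."""
    return i_fullscale_a * parallel(r_card, r_load)

# 75 ohms on both ends gives the standard 0.7 V video swing:
print(round(output_level_v(18.67e-3, 75.0), 3))   # 0.7

# Nudge the card-side resistor and the level (hence the gray scale) shifts:
print(round(output_level_v(18.67e-3, 82.0), 3))
```

This is also why unequal RGB termination resistors would show up as slightly mismatched channel levels.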


Terry is right about the improvement in the drivers, and even if you've already downloaded the same driver numbers, download them again.


----------



## benny

Quote:

_Originally posted by mp20748_
*The 5950 is the best video card that I've seen so far, but in order to see it at its best, IT MUST BE CALIBRATED. Finally, a video card that renders PERFECT blacks and is so clean, it looked better than film.


Ten Thumbs Up!*
*Crash ... Bang ... Tinkle !*


That's the sound of 1000's of Radeon based cards simultaneously hitting the rubbish bins as the videophile owners rush out the door to go buy nVidia's offering 


Mike ... could you describe more fully the improvements I would see moving from my Radeon 9600 Pro (MP-1 fitted, of course!) to the new NVidia product with your mod? The MP-1 made a noticeable improvement to the stock 9600 card, particularly in the areas of shadow detail rendition and video noise reduction, and from what I've just been reading there are substantial improvements again.


Cheers and thanks for a fine product


Russ


----------



## stylinlp

MP, are you saying that an MP-modded 5950 in an AMD 1800 machine gives better image quality than a full-blown Pentium 4 2.8C w/ 800FSB motherboard, 1 GB of RAM, and a Radeon 9600 with full resize using ffdshow?

Both using TheaterTek on a Marquee Electrohome 8500 projector.


See, the whole point of using ffdshow's resize is to do a better job of resizing than ANY video card is capable of. A video card uses the worst resize algorithm possible. With ffdshow you can use bicubic or Lanczos, the best resizing methods known.
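That difference between resamplers is easy to demonstrate in one dimension. Below is a rough, stdlib-only Python sketch (not ffdshow's actual code, and simplified to 1-D) that upscales a hard black-to-white edge 2x with a bilinear (triangle) kernel versus a Lanczos-3 kernel. The Lanczos result keeps the transition steeper, at the cost of slight overshoot at the edge, which is the "ringing" discussed elsewhere in this thread:

```python
import math

def lanczos(x, a=3):
    # Lanczos windowed-sinc kernel with support a (a=3 is the common choice)
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def triangle(x):
    # Bilinear (tent) kernel: roughly what a video card's overlay scaler uses
    x = abs(x)
    return 1.0 - x if x < 1.0 else 0.0

def upscale(signal, factor, kernel, support):
    # 1-D resample: each output sample is a kernel-weighted sum of neighbors
    out = []
    for i in range(len(signal) * factor):
        src = i / factor                      # position in source coordinates
        lo = math.floor(src) - support
        hi = math.floor(src) + support + 1
        acc = wsum = 0.0
        for j in range(lo, hi + 1):
            w = kernel(src - j)
            acc += w * signal[min(max(j, 0), len(signal) - 1)]  # clamp edges
            wsum += w
        out.append(acc / wsum)
    return out

edge = [0.0] * 8 + [1.0] * 8                  # a hard black-to-white edge
soft = upscale(edge, 2, triangle, 1)          # bilinear: smears the edge
crisp = upscale(edge, 2, lanczos, 3)          # lanczos: steeper, slight ringing
```

On real video the same trade-off shows up in 2-D as smeared versus crisp edges; the bilinear output stays strictly within the original black-to-white range, while the Lanczos output overshoots slightly on either side of the edge.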


----------



## mp20748

Quote:

_Originally posted by benny_

*


Mike ... could you describe more fully the improvements I would see moving from my Radeon 9600 Pro ( MP-1 fitted of course! ) to the new NVidia product with your mod? The MP-1 made a noticeable improvement to the stock 9600 card, particularly in the area of shadow detail rendition and video noise reduction, and from what I've just been reading there is substantial improvements again.


Cheers and thanks for a fine product


Russ*
*
That part we'll do next. We still have to get to Deniz's to compare it to the 9800 Pro/MP-1.


So far, it does video very well. However, it may need a little tweaking to get it to its best. But for now, it looks exceptionally clean, mainly because it provides a perfect foundation for both color and shadow detail, giving a smooth, clean film look. We're going back to Mark's next Sunday, and from there to Deniz's.


Stylinlp,

I'm using it with a Celeron 2.8 GHz and 256 MB of RAM. We're also running installed HD movie files from the hard drive, and they're working just fine. I'm not familiar with ffdshow, so I can't comment on that.*


----------



## stylinlp

Really? hmmm

Mike, with all due respect, I really think you should find a way to become familiar with ffdshow. It seems that all the serious HTPC users use ffdshow. I know that a few years ago some of you in the "crt projector" crowd didn't think much of ffdshow, but things have come a long way since then. Back then the fastest CPU was the AMD 2400 hehe.

I remember a few comments on the denoise filter being artificial and false. Ffdshow users don't even use that filter anymore. It's Denoise3D now 

A few years ago most people couldn't do a 1440x960 resize in bicubic. Now that Pentium 4 2.8C chips are cheaper, it's more feasible. Also, within its resizing ffdshow can sharpen the chroma and luminance separately. That's incredible. Just use these filters with resize carefully and you can get a native DVD to look just as good as HDTV.

Then imagine how this would look with MP VIM mod! 


The question I have is: if using ffdshow and its resize, would the quality of the DACs in the video card affect image quality, since it's ffdshow that's doing the resizing rather than the video card?

Some of us have older Radeon 9000 cards with their 10-bit DACs. We need to decide whether upgrading to the 12-bit DACs that come with the newer cards would help image quality when using ffdshow's resize.


----------



## mp20748

Bblue has and uses ffdshow, and he also has the 5700. Hopefully he'll chime in; he's also an elitist who has been a big help to me on some of my projects.


What projector are you using?


----------



## stylinlp

TheaterTek on a Marquee Electrohome 8500 projector.


Went from Curt's Barco 800 to this Marquee 8500, bought from DraganM in Denver. I live in the same city as Tim, so I might drop by next week and check out that VIM 


I have a feeling that even though ffdshow might be handling all the resizing, the image quality is still affected by the video card. Scoping a video card for noise and how clean it is must cover any signal coming out of that card, resize or not. I'm just not sure whether the DACs affect it. Probably it's just how clean the card is in general.


----------



## mp20748

Quote:

_Originally posted by stylinlp_
*TheaterTek on a Marquee Electrohome 8500 projector.


Went from Curt's Barco 800 to this Marquee 8500, bought from DraganM in Denver. I live in the same city as Tim, so I might drop by next week and check out that VIM 

*
Good move from the 800... Now, how can we involve you in this testing? Since you live near Tim, can we send the card to you to play with?


That would be a great help to me.


----------



## stylinlp

Might not be a good idea anytime soon, since all I have is an AMD 1000 with 256 MB of RAM, and the motherboard has a 4x AGP port. I do have that new Antec Overture HTPC case with its True 380-watt power supply. Hmm, a super-powered video card would improve my HTPC, but how would I know without scopes and other measuring devices besides my eyes? I'm sure Tim would be a better judge of the image quality difference. If he wanted to come by and do my stigmator adjustments and see the before and after of that video card *grin*.

Funny, I just emailed him yesterday asking if he could come by and do some fine adjustments that I don't have much experience with, like the stig and other stuff. I've got the projector looking as good as "I" can with my limited projector setup experience, but I've never touched the stig, etc. I'm sure I have banding.

Another thing: my AMD 1000 can only use ffdshow for Denoise3D, no resizing. My buddy has an AMD 2000 and we tried resize at 1440x960. Got like 5 fps lol. But those frames looked sooo clean. Noise free. Well, more noise free than before....

I was thinking of buying a Pentium 4 2.8C motherboard combo deal at Fry's Electronics for $200, but I just spent $500 replacing all the brakes on the van this weekend  So that will have to wait a few weeks.


----------



## hdtv_lover

I'd also like to know how the card does BEFORE any MP mods, compared to after. Also, what mods are needed, and how do I get them done (on a small budget)?


Also, who is the manufacturer of the card you're using, Mike?


Thanks for the great work!


----------



## Vern Dias

Mike, still waiting......


And... for those who maintain that ffdshow is the only way to achieve a high-quality image, I would like to point out that since ffdshow resize is unwieldy in a constant-height, variable-AR setup, I have always depended on the video card for the scaling.


My desktop is 720x1440; the projector applies a slight horizontal stretch to make the image AR 2.40:1 to match the screen. Zoom Player and the video card do the resizing duties to handle everything from 1.33:1 to 2.55:1 (two modes: one crops the sides slightly, the other delivers a full-width image with a 3" height reduction).


FFDSHOW runs the DScaler sharpen filter, and can correct for poorly transferred DVDs by modifying gamma, black level, etc. on a per-DVD basis that is then stored by ZP and applied whenever that DVD is played.


My 5950 Ultra and my previous Radeon 9800 Pro do the scaling with no visible issues. (VMR9)


Vern


----------



## stylinlp

Vern, I have had the vertical squeeze mod done to my Marquee 8500. With a 1.85:1 screen, my horizontal adjustment is 92 and my vertical adjustment is around 10 in the projector. 90x36" Gatorfoam board.


I use Powerstrip to get my [email protected] which is 2x native DVD resolution.


Theater Tek does the rest of the Aspect Ratio adjustments per movie basis.


What we are saying about using ffdshow is that the resize algorithms used in any video card are of the poorest quality. Anyone with experience as a graphic artist in Photoshop knows that bicubic and the like are a much superior way of resizing an image.


I believe what MP is measuring with his scopes is how clean the actual video signal coming out of a card is, not how that card resizes an image.


----------



## thirdkind

I use an MSI 5700 with Zoom Player, WinDVD filters (superior to Sonic/TT imho), and VMR9. I had never tried ffdshow resize before, but gave it a shot this weekend. Unfortunately my P4 2.6 (OC'd to 2.85GHz) and 5700 aren't quite enough to get the job done, even when only resizing to 1280x720. I get minor tearing and occasional video lockups when using bicubic scaling. Lanczos and spline are even more unstable.


The bilinear scaling offered by VMR9 just can't compete with ffdshow's bicubic; Lanczos or spline resize are even better. They enhance detail (which VMR9 definitely needs, being very soft and natural compared to overlay) but don't add noise or nasty ringing.


Perhaps later this year I'll upgrade to a 6800 and a high-end P4 (dual processors perhaps?). VMR9 is still pretty nice compared to the overlay though, even with its inferior scaling. Definitely the way to go.


----------



## Mark_A_W

Jay, I think the tearing issue with VMR9 is not simply a matter of grunt. With your system I would've thought you'd have plenty of power.


It's more than grunt; I think it's a more fundamental problem with VMR9. I'm disappointed an FX5700 doesn't fix it.


I get the tearing problem with my XP2400 at 2600 and my now-ancient Radeon 9000 Pro. The actual resize to 1280x720 is no problem with Lanczos.


I might give Vern's DScaler sharpen method another go, but I like resize better. I'll play around and see if the tearing only occurs with resize.


----------



## crumbaker

Well, one thing I have yet to see pointed out is that you're comparing a current-gen card (5700) to a card that's two architecture generations behind (9500).


I'm far more of a computer nerd than an HT enthusiast, and most of us will tell you ATI has traditionally had a better image than Nvidia. Not always faster, but usually higher image quality.


However you may be correct and the different manufacturers do make a difference.


But my point is you're comparing a mid-range card from two generations ago to a high-end card from this generation.


----------



## thirdkind

Quote:

_Originally posted by Mark_A_W_
*Jay, I think the tearing issue with VMR9 is not simply a matter of grunt. With your system I would've thought you'd have plenty of power.


It's more than grunt; I think it's a more fundamental problem with VMR9. I'm disappointed an FX5700 doesn't fix it.


I get the tearing problem with my XP2400 at 2600 and my now-ancient Radeon 9000 Pro. The actual resize to 1280x720 is no problem with Lanczos.


I might give Vern's DScaler sharpen method another go, but I like resize better. I'll play around and see if the tearing only occurs with resize.*
The tearing only occurs when ffdshow is in the mix. WinDVD filters straight to VMR9 results in glitch-free playback.


Could be my "ancient" motherboard (Northwood P4 with 400MHz FSB, RDRAM, and AGP 4x).


----------



## Budget Pete

Just on the CPU upgrades mentioned in this thread: we have found that the newer Barton-core AMD 2500XP chips all run fine at 3200 speeds just by raising the FSB to 200MHz. Even with the stock CPU fan they are as solid as a rock. A cheap(er) way to get to where you can use ffdshow, etc. Just make sure your RAM is also happy at a 200MHz FSB.
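For anyone following along, the arithmetic behind that trick is just multiplier times FSB. A quick sketch, assuming the usual locked 11x multiplier on the Barton 2500+ (check your own chip's multiplier before trying this):

```python
multiplier = 11          # typical locked multiplier on a Barton 2500+
stock_fsb = 166          # MHz (333MHz DDR)
raised_fsb = 200         # MHz (400MHz DDR)

print(multiplier * stock_fsb)    # core clock sold with the "2500+" rating
print(multiplier * raised_fsb)   # 2200 MHz, the same clock the "3200+" ships at
```

The RAM caveat matters because the memory normally runs synchronously with the FSB, so it has to be stable at the raised 200MHz as well.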


----------



## mp20748

Quote:

_Originally posted by hdtv_lover_
*I'd also like to know how the card does BEFORE any MP mods, as compared to after. Also, what mods are needed and how do I get them accomplished (within a small budget?)


Also, who is the manufacturer of the card you're using, Mike?


Thanks for the great work!*
I would think this card looks good without the MP-1, but I'm not sure; I didn't think to check before the mod.


We're working on upgrading our website, and once that's completed, there'll be a link to someone who'll offer the service of attaching the mods.


Vern,

I forgot to get back to you about the changes made to the proposed testing. Some things changed, so I now own a 5950 instead of the 5700, and I've modded the 5950 with the MP-1. It will be used for testing, but I'm hoping to use the 5950 in comparison with another 5950. The 5950 pretty much uses the same GPU as the 5700, but there are some other features of the 5950 that make it a little cleaner than the 5700. The difference in that cleanness may not be noticed with DVD playback; I'm hoping it'll make somewhat of a difference with the HD files. We'll see once I get time to get back to it, but once I do, I'd like to send it out for testing as well.


---


"and most of us will tell you ati has traditionally had a better image than Nvidia."


Carl Lewis used to be the fastest human in the world.


----------



## Vern Dias

NP, Mike. The 5950 Ultra is what I have been using all along. I have two identical systems; one has the ATI 9800 in it, the other has the 5950, so I can A/B test very easily by swapping out the ATI.


Vern


----------



## mp20748

Bingo!


----------



## thirdkind

Mark,


I can't thank you enough for the DMO_Abstract tip in my topic in the HTPC forum. I searched the HTPC forum and found this post detailing not only DMO_Abstract, but the registry tweaks necessary to get the WinDVD filters to cooperate with ffdshow.


I'm now successfully using Lanczos resize at 1440x960 with absolutely no tearing or dropped frames whatsoever  Spline works at 1280x720 but occasionally drops frames at 1440x960. I don't really see a difference between 1280x720 and 1440x960, so I'll stick with 1280x720 and save the CPU cycles for experimenting with ffdshow's other features.


It appears a P4 2.6GHz (overclocked to 2.85GHz) and an FX5700 non-Ultra at stock clocks are perfectly capable of running high Lanczos resize resolutions in VMR9 with no visible issues.


I'm using the leaked 61.12 drivers downloaded from guru3d.com.


----------



## mp20748

Vern,

we'll have to get back to this next week. There's no question in my mind that the 5700/5950 cards are superior to the ATIs. I say that concerning the DACs' analog performance only; I don't get into anything else on these cards. I still have no clue what FFDSHOW is, and I have no interest in what it does. I'm from the old school: I don't believe in adding anything to the signal. I want it as pure as possible, and that is what my tests are based on.


With the exception of that FFDSHOW thing, I like your approach to evaluating the video cards. DMW is a must. It is very important to do pre- and post-software-player tests using specialized software.


For now I'll need some help with the testing until I'm finished with some other issues.


----------



## thirdkind

Mike,


ffdshow is basically a series of highly configurable video processing filters. Its capabilities include different scaling algorithms, sharpening techniques, deinterlacing, noise reduction, etc. Just about anything you can think of.


If abused, ffdshow will ruin the image, just as cranking the sharpness control on your display would. I haven't tried the sharpening tools yet--and probably never will--because I find high frequency noise that simulates increased detail to be objectionable. I prefer to see what's there rather than trying to trick my eyes into believing DVD has more to offer than it really does.


However, the scaling algorithms are top-notch--far superior to the scaling your video card is able to perform. It's worth incorporating ffdshow into the mix just for the advanced scaling algorithms it provides. Using ffdshow for scaling isn't doing anything your video card isn't already doing itself; ffdshow is just capable of doing it much, much better.


I rarely jump up and shout about the immediate and noticeable improvements brought about by changes in my system. Most are too subtle for my eyes to notice. But even my eyes can tell bad scaling from good scaling, and ffdshow steps all over the scaling performed by the 5700 (or any video card for that matter).


----------



## bblue

Sorry, I've been a little behind on Forum stuff.


I have to agree with others using ffdshow on their HTPC. There's just no way to get optimum performance with any video card without it. I would almost question the outcome of testing various video tweaks without it in place -- you might as well be watching a good DVD player.


I'm also a little curious about Vern's preference for VMR instead of Overlay. It has been my consistent experience (and that of many others in the ffdshow thread in the HTPC Forum) that VMR is simply not as defined or sharp as Overlay. The Overlay method, when used with ffdshow's bicubic or Lanczos resize, is much cleaner and more detailed to me. When I was using the ATI 9800 Pro/MP-1 combination I preferred Lanczos resize, but when I changed video cards to the Parhelia, which by itself is more defined, I noticed the Lanczos resizing was muddying up edge detail as well as detail surrounding small objects. Switching to bicubic resize took care of that, and this has held true with the NVidia 5700 Ultra.


I used to also use DScaler's sharpening as an after-filter to the resize, set at low numbers in the 30-50 range. On the ATI with Lanczos resize this seemed like an excellent improvement, but since the 5700 Ultra I'm finding that it adds an artificial sense of sharpness that really detracts from real detail, at any setting. So other than a picture-adjustment filter (black, white, gamma levels) ahead of the resize, that's all I'm using now. If someone has another complete configuration to try, let me know -- I'd love to check it out.


Here are my current ffdshow settings in order and by tab name:


codecs: raw video = YUV All

levels: in: 0-235, out: 16-255, gamma: 1.18

resize: 1440x960, resize always, no aspect ratio management

settings (under resize): bicubic, parameter default, luma 1.80, chroma 1.70

output: YV12 (only)


I'm a little iffy on the relationship between the 'levels' settings and the 'output' setting of YV12, but it seems to give the best black-to-white balance in this configuration.
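For anyone wondering what that 'levels' tab is actually doing: it's a standard levels-plus-gamma remap of each pixel value. Here's a rough sketch of the generic formula, using Bill's numbers as defaults (ffdshow's exact internal math may differ, so treat this as illustrative only):

```python
def levels(x, in_lo=0, in_hi=235, out_lo=16, out_hi=255, gamma=1.18):
    # Map an 8-bit value: clip to the input range, apply gamma, rescale to output
    t = min(max((x - in_lo) / (in_hi - in_lo), 0.0), 1.0)
    t = t ** (1.0 / gamma)
    return round(out_lo + t * (out_hi - out_lo))

print(levels(0), levels(235))    # the input range endpoints land on 16 and 255
```

With these settings everything at or below the input floor maps to the output floor and everything at or above 235 maps to 255, while the gamma term bends the midtones, which is why the levels and output settings interact with the black-to-white balance.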


Up to now I have not had an MP-1 on the 5700 Ultra (or the Parhelia), but I'm adding one tonight and will see just what changes. Both of those cards can drive a cable very well, but there's always room for improvement!


--Bill


----------



## Nich

I have a 5700, maybe I should give FFDSHOW a go with your settings....


Nicholas


----------



## Budget Pete

Bill, what do you think of the 9800 with MP1 vs the 5700 without?


I haven't investigated ffdshow yet. Can you use it with TheaterTek, or is it a Zoom Player-only thing?


----------



## bblue

Pete,

I responded to that in a different thread http://www.avsforum.com/avs-vb/showt...97#post3879797 .


Yes, ffdshow will work with Theatertek.


But that reminds me that I neglected to include an important aspect of my ffdshow configuration: I'm using it as a filter in Zoom Player Pro, specifically with the WinDVD 5 video filter. None of the others (Sonic, PowerDVD, NVDVD 2, etc.) compare in this configuration. The NVDVD 3 filter is supposed to be really super, but I haven't got it yet (it's in beta).


Even if the other filters don't compare, ffdshow as described will be an improvement over not having it.


--Bill


----------



## CaspianM

The nVIDIA overlay settings offer a sharpness control that works well without the typical sharpness artifacts. I have been using it with good results.


----------



## thirdkind

Quote:

_Originally posted by bblue_
*But that reminds me that I neglected to include an important aspect of my ffdshow configuration: I'm using it as a filter in Zoom Player Pro, specifically with the WinDVD 5 video filter. None of the others (Sonic, PowerDVD, NVDVD 2, etc.) compare in this configuration. The NVDVD 3 filter is supposed to be really super, but I haven't got it yet (it's in beta).


Even if the other filters don't compare, ffdshow as described will be an improvement over not having it.


--Bill*
I'm using the same configuration (ZP, WinDVD, ffdshow), and it's easily the best DVD playback I've seen on a PC. I know people love TheaterTek because of the interface, and I can see how hacking the registry to get WinDVD to cooperate with ZP would turn a lot of people off, but the Sonic filters can't compete with the WinDVD filters. I've tried all the same filters as you and came to the same conclusion. Only the Elecard filters are comparable, but I can't get them working properly with ZP (weird aspect ratio issues).


I have NVDVD 3.0 (the package mistakenly posted by Viewsonic on their web site a while back), but can't get the filters to work with ZP. I tried Forceware 3.0 for DVD playback, but something isn't right with that either because the image quality is horrible and I can't get DD/DTS passthrough working with my RME Digi96/8 card.


As for VMR9, a lot of us prefer it because it doesn't have the artificial edginess of the overlay. My first viewing with VMR9 was Open Range, and while my initial impression was that it was too soft, an hour or so into the film I realized how relaxing, natural, and pleasing the image was. After using it for a couple of weeks and then switching back to the overlay, I was sickened by the grating sharpness of it.


The primary drawback to VMR9 is its inferior bilinear scaling, which is a limitation of the graphics hardware. ffdshow's superior scaling algorithms and VMR9's clean rendering make the perfect combination.


Last night I watched Bubba Ho-Tep (amazing, weird little film), and as my first full-length feature with ffdshow performing the scaling, I was very satisfied. Some very noticeable ringing though. I'm going to unplug ffdshow tonight and skip around the movie using just VMR9 and see if the ringing is in the transfer or a byproduct of the Lanczos scaling.


----------



## Budget Pete

Wow!

Quote:

__________________________________________________

The 5700U without an MP-1 is sharper, has better detail, truer color and no artifacts that I've noticed yet.

___________________________________________________

Thanks Bill.

So a 5700U by itself looks better than an MP-1 9800 combo!? That's really great. (Sorry about missing your answer in the other thread.)

That makes a really affordable upgrade for those of us sitting on a stock 9000/9200 etc.

Can't wait to hear what difference the MP-1 added to the 5700 will make.


----------



## CaspianM

Quote:

_Originally posted by thirdkind_
*As for VMR9, a lot of us prefer it because it doesn't have the artificial edginess of the overlay. My first viewing with VMR9 was Open Range, and while my initial impression was that it was too soft, an hour or so into the film I realized how relaxing, natural, and pleasing the image was. After using it for a couple of weeks and then switching back to the overlay, I was sickened by the grating sharpness of it.

*
The overlay still offers a moderate solution without any edginess, ringing in particular, for those who have not opted for VMR9. In fact, at 25% I did not notice any significantly objectionable ringing. This is a sharp video card by itself, and I just don't know how much perceived sharpness one needs for DVD playback. Most of the time I prefer no added sharpness at all. It looks good as is, IMO. Then again, I am old-fashioned and like simplicity and straightforwardness in my setup.


----------



## stylinlp

Budget Pete, I want to know that one too. Can't wait to get rid of this Radeon 9200SE.


----------



## shatten22

thirdkind-


what do you mean by "ringing" in Bubba Ho-Tep? I noticed sound distortion problems whenever Elvis yelled, especially when they finally faced Bubba at night in the yard. Did you have this problem also?


thanks


geoff


----------



## bblue

Jay,

I don't recall all the details at the moment, but the overlay ringing issue (which was real, btw) was resolved many months ago. What I don't recall is how, but I believe it was in DirectX 9.0b. I've seen no evidence of it for quite some time, though if ffdshow is used incorrectly, with too much or the wrong type of sharpening, you will see ringing in overlay mode that you don't see in VMR9. That's only because VMR9 is soft and hiding it.


The whole ringing issue is compounded by different types of ringing on the DVD themselves. Some filtering will hide it or emphasize it depending on the DVD. Personally, I try to keep additional filtering to a minimum and go for the most neutral and cleanest display. Then it's very clear where the artifacts that you see from DVD to DVD are actually occurring.


Any overlay 'sharpening' options there might be in a card driver are certainly artificial enhancement and shouldn't be used.


On the WinDVD 5 decoder, I'm not sure why you would need to change any registry settings. Do you know what needed changing? The only thing you need to do in ZP Pro is add the DMO_abstract filter set to 0 (ahead of the ffdshow filter). I do that by running the WinDVD program once, setting abstract to 0, then exiting. Perhaps the registry setting is a way to set abstract if you don't have the WinDVD program?


--Bill


----------



## bblue

I'm very pleased with my addition of the MP-1 to the 5700U card. It's not a major jump-out-at-you change, but is significant.


On my system with a 35' RG6 cable run between the PJ and HTPC, the first thing I noticed was that the picture seemed to be 'bolder', almost like there was more contrast, but no adjustments had been changed and the MP-1 was set for unity gain (no jumpers). There was more distinction between objects in the same range of light and coloring. Everything was just clearer.


Looking at test bars, edges were sharper with less bleed. Single pixels by themselves or colored on top of a differently shaded solid color surface were more distinct.


The only downside I can see with this on DVD source is that problems with the DVD are more obvious, as are artifacts of subsequent filtering or sharpening. To me this is really an upside, because it makes tuning filters easier, but those who prefer the softening of some filters or VMR9 may have to make some changes.


On the other hand, my wife saw no change at all ... so there you go.


--Bill


----------



## mark haflich

Bill. As a retired engineer and lawyer (but still a very active retailer), I think I am competent to comment on your observational skills. Alas, you have gotten so involved with improving the art of CRT display, etc., that you have lost the ability to see the forest through all the pruning you have done to those trees. There is only one way to eliminate all the problems with DVDs. Duh. Don't watch 'em. I have a simple solution.


Take today, for example. Top down on the Miata, a nice drive from MD to northern NJ to stay overnight with an old college friend and his wife. Then off to Belmont racetrack to see the Belmont. We will be at the track from 10:30 AM (first race at noon) to the last race at 8 PM. Then a nice dinner, a quick snooze, and then back to MD by noon to spend a nice quiet Sunday afternoon with Mike Parker and friends playing with my 9500LC and a variety of HTPC cards and mods. Unfortunately, this will indeed involve watching some DVDs, but maybe I can leave the room, only returning for test patterns.


The trick is to get rid of everybody in time for the Lakers game at 8 PM or so. Now that video doesn't have those nasty problems associated with DVDs. The game is in HD, over the air in 1080i I believe. I can't ***** about deinterlacing artifacts or MPEG compression. I'll just enjoy the game. No HTPC, no processor, just watch the game in 1080i. And my wife doesn't like basketball, but if she comes down to the HT for a few minutes she will indeed comment on how great the picture looks and how lucky we are to have such a nice 12-seat HT. She runs all sorts of movie parties for our friends down there. Yep, I start everything up and then leave the room if we are showing a DVD (OK, not because of all those nasties, but because I have seen the DVD before).


The point here is that for a fresh event, the HT is great. Once I have seen the event, my focus goes to spotting flaws. DVDs drive me crazy. I'd rather do something else with my life and wife. Like feeding MP and his and my friends. Trust me, MP is one very picky eater. Have a nice weekend, Bill. Hell, I won't even check AVS until Sunday afternoon, when Parker puts on The Fifth Element for the 10,000th time.


----------



## thirdkind

Quote:

_Originally posted by shatten22_
*what do you mean by "ringing" in Bubba-Hotep? I noticed sound distortion problems whenever Elvis yelled, especially when they finally faced Bubba at night in the yard. Did you have this problem also?*
I'm referring to ringing in the image, not soundtrack distortion. Many of the edges in the image had a distinctive line around them. Look at the far shots of the retirement home with the hearse parked out front. If you look at the hearse, particularly around the curved decorative metal on the side, you should see some very distinctive outlines--if they're in the source, that is. As I said, I need to watch it again tonight without ffdshow running to see if the Lanczos scaling is adding the ringing or if it's in the source.


I didn't notice any dialog issues at all. As a matter of fact, I was impressed with the clarity of the dialog. I'll check the final scenes again though.

Quote:

_Originally posted by bblue_
*Jay,

I don't recall all the details at the moment but the ringing overlay issue (which was real btw) was resolved many months ago. What I don't recall is how, but I believe it was in DirectX 9.0b. I've seen no evidence of it for quite some time, though if ffdshow is used incorrectly with too much or the wrong type of sharpening, you will see ringing in overlay mode that you don't see in vmr9. That's only because vmr9 is soft and hiding it.


The whole ringing issue is compounded by different types of ringing on the DVD themselves. Some filtering will hide it or emphasize it depending on the DVD. Personally, I try to keep additional filtering to a minimum and go for the most neutral and cleanest display. Then it's very clear where the artifacts that you see from DVD to DVD are actually occurring.


Any overlay 'sharpening' options there might be in a card driver are certainly artificial enhancement and shouldn't be used.*
I think the only thing that makes VMR9 "soft" is its use of the video card's inferior bilinear scaling. When using ffdshow bicubic or Lanczos resize in concert with VMR9, the video card no longer performs any scaling (unless you feed it something higher than desktop resolution, in which case it scales down). VMR9 simply displays what it's being fed, so the result is a very clean image.


I don't use any sharpening filters in ffdshow. I don't use any of the other processing at all actually, just resize, which is noticeably superior to my 5700's scaling. I don't even adjust the Lanczos parameters (chroma/luma sharpen, etc.). I leave them at default because the only thing I'm interested in is the superior scaling algorithm.


It should be noted that I'm using a Sharp 12K at the moment, so perhaps VMR9 is too soft in CRT land, but just right for a digital projector.

Quote:

*On the WinDVD 5 decoder, I'm not sure why you would need to change any registry settings. Do you know what needed changing? The only thing you need to do in ZP Pro is add the DMO_abstract filter set to 0 (ahead of the ffdshow filter). I do that by running the WinDVD program once and setting abstract to 0 then exit. Perhaps the registry setting is a way to set abstract if you don't have the WinDVD program?


--Bill*
You're probably right. The thread I found in the HTPC forum suggested registry changes to get the abstract filter working, but those registry changes probably do the same thing you're talking about.


----------



## Archos

OT - can you use the MP-1 with composite sync in Powerstrip, using only four cables (RGBS)?


Come on, someone must know! I have four cables and have ordered an MP-1 -- do I need a fifth?


----------



## audiman

Is there a difference between a 5700 and a 5950 for HTPC purposes?


----------



## bblue

Jay,

I'll give VMR9 another try. My last few tests were still with the ATI 9800 Pro card, but they did use the same ffdshow filtering settings, so it should have been an apples-to-apples comparison.


--Bill


----------



## bblue

Well Mark,

I'm not sure what to say! No convertible, don't enjoy horse racing, Mike lives too far away, and I don't care for watching sports either. Also --from experience-- retailing dulls the senses.


BUT I do agree with you about DVD's. If it wasn't for constant doses of HDnet, Discovery HD Theatre, Bravo HD and a couple of others including the major networks this would be a very dull and uninspiring hobby. With those as a reference, though, you have a mental guideline of what the picture should look like and don't get stuck in the rut of comparing various DVD's to each other to determine whether a change is an improvement or not.


Fortunately for me, I can (mostly, anyway) turn down my critical eye in order to watch a DVD movie and still thoroughly enjoy it. Some are certainly harder to enjoy than others because of crappy processing, but I'm glad I have them anyway. Even some old Laserdiscs captured and processed through Dscaler can be quite enjoyable.


I do see the forest, the trees in it, and the trees even if they weren't in it. Maybe I'm just lucky?


--Bill


----------



## stylinlp

I hear you, audiman. With all these posts flying all over the place in this thread and others, it's tough to remember what is fact and what is opinion.

I'm in the same boat. I have a Radeon 9200 and I'm trying to decide what to do.

I know this must have been answered before, but here it goes:


"Doesn't nVidia drivers bite?" Don't they have problems compared to Radeon when it comes to DVD movies and resizing to 1440x960?

DId I read recently on here that the newest drivers from nVidia resolve that issue? Is it true?


"5700 Ultra or 5950?" Most of use can't afford a $400 videocard just for DVD movie playback. Is the 5700 Ultra just as clean as the 5950?


Thanks for your patience, those of you who have answered these before.


----------



## malefactor

stylinlp, you bring up a good point; many of us look at this from the perspective of "What is the best available?", "Give it to me now, no matter the price.", "I said now!". I certainly am guilty of this. For me, $500 for a new, cool video card that is clearer is worth every penny.


Maybe with some patience we'll get harder facts from Mike Parker et al. on this.


----------



## Budget Pete

I figure, if Bill reckons the 5700U is better than a Radeon plus an MP-1, and that the MP-1 then added to the 5700U makes a small incremental improvement, then I am chucking my Radeon and getting a 5700U without the MP-1.

If his wife can see no difference between the modded and unmodded 5700, then that is good enough for me.


If you have had your projector professionally calibrated, and have a totally dark room, and have lots of cash, then get the best; but it sounds like a stock 5700U is so close to as good as it gets that it is the sweet spot. And for once it would allow you to play games, if you are so inclined.


----------



## Vern Dias

Quote:

I'm also a little curious about Vern's preference of VMR instead of Overlay
Two reasons:


1: It emphasizes any EE or ringing present in the source


2: As an experiment, try using ZP to move the edge of the image (for example, a black bar on a 2.40:1 DVD) past the edge of the desktop, either on the top/bottom or the sides, and watch the three color planes of the overlay move independently of each other. They only align once every 4 pixels. This is a bug that I reported to ATI several years ago. nVidia circumvents the issue by only resizing the overlay on every third or fourth click.


Also, ZP has a setting to eliminate any excessive softness in VMR9 when scaling the image.


Anyhow, it's all about personal preference and the use of a really great DirectShow video filter set (NVIDIA FWMM and postproc) in conjunction with the judicious use of DScaler sharpening through ffdshow.


And, since I can clearly see the film grain on almost every film-sourced DVD, I doubt that I have any issues resolving detail due to the use of VMR9.


Vern


----------



## bblue

Vern,

If you don't mind, would you detail more specifically how your ZP Pro and ffdshow are configured, with respect to the particular filters and the settings relevant to them?


Also, I'm not sure I understand quite how overlay could single out and emphasize EE or ringing present on the DVD. From what I've seen in previous tests (not within, say, the last three months) the character of ringing with either display method is the same, and since overlay shows more detail, it naturally would show more ringing -- but not disproportionately so. My tests were on specific hardware and software configurations so there could have been other variables at play.


--Bill


----------



## mp20748

Quote:

_Originally posted by Budget Pete_
*


I figure, if Bill reckons the 5700U is better than a Radeon Plus an MP-1, and that the MP-1 then added to the 5700U makes a small incremental improvement, then I am chucking my Radeon and getting a 5700U without MP-1.

If his wife can see no difference between the modded and unmodded 5700, then that is good enough for me 

*
I notice that you're not in the US, but for some reason you seem to report things the same way the US media does.


This is Bill's quote:


"I'm very pleased with my addition of the MP-1 to the 5700U card. It's not a major jump-out-at-you change, but is significant.


On my system with a 35' RG6 cable run between the PJ and HTPC, the first thing I noticed was that the picture seemed to be 'bolder', almost like there was more contrast, but no adjustments had been changed and the MP-1 was set for unity gain (no jumpers). There was more distinction between objects in the same range of light and coloring. Everything was just clearer.


Looking at test bars, edges were sharper with less bleed. Single pixels by themselves or colored on top of a differently shaded solid color surface were more distinct.


The only down side that I can see with this on DVD source is that problems with the DVD are more obvious, as are artifacts of subsequent filtering or sharpening. To me this is really an up side because it makes tuning filters easier, but those who prefer the softening of some filters or vmr9 may have to make some changes."


I also interpret this as being a "significant" improvement, but of course, I fully understand what is being said here. How did you get "a small incremental" from the above?


In my sound system I have a vacuum tube preamp that does not have tone controls. When I had guests over, they could never understand why I would want a preamp without bass, treble and other tone gadgets. I built the preamp myself. To this day, two of my brothers cannot understand a sound system that does not have tone controls, nor why I never bothered to explain the difference...


Some things are just not for everybody.


----------



## stylinlp

Thanks for pointing that out MP.


I see and understand that Bill did see an improvement with the MP mod over his 35 feet of cabling.


He mentioned that the image was so good that he could see the problems with some DVDs not being up to par. ffdshow would fix that.


In case anyone is interested: my buddy had a BBQ party today, and I brought over the newest version of ffdshow for his HTPC and Marquee 8500. Since he only has an AMD 2000 overclocked to AMD 2200 speeds, I just enabled the DeNoise3D filter. Whew, what a difference. All those wiggling maggots in the background of static scenes vanished. He's a happy camper now. I then proceeded to tell him about the MP VIM board as the next step.


----------



## Vern Dias

FFDShow is being used solely in the DScaler sharpen mode, running at anywhere from 32 to 192, depending on the DVD being viewed.


Since I have a constant-height screen, using FFDshow to scale the image would be difficult, since each different AR would require a different set of scaling parameters.


ZP does all the AR adjustments using VMR9, as overlay has serious issues with the alignment of the chroma planes when the image is scaled beyond the edge of the desktop.


I am in the process of tuning up a new environment using a Sharp 12K fed by DVI from the 5900 Ultra , and I am certainly impressed with the reduction in visible EE with this new environment.


Vern


----------



## pcgeek

Quote:

_Originally posted by Vern Dias_
*FFDShow is being used solely in the DScaler sharpen mode, running at anywhere from 32 to 192, depending on the DVD being viewed.


Since I have a constant-height screen, using FFDshow to scale the image would be difficult, since each different AR would require a different set of scaling parameters.


ZP does all the AR adjustments using VMR9, as overlay has serious issues with the alignment of the chroma planes when the image is scaled beyond the edge of the desktop.


Vern*
Not sure if the benefits of scaling above the target resolution are real or not, but I'm also running a constant-height screen, and I optimized all of the resolutions and scaling for 2.35:1 movies (1280x544 screen, scaling to 1280x720, or 1440x960 if the CPU can handle it). It doesn't result in perfect scaling for the other ARs, but it still looks a LOT better than just letting the video card scale. This way the video card is always scaling down as well.
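As a sanity check on where a resolution like 1280x544 comes from (assuming square pixels, which may not hold for every modeline):

```python
# Back-of-envelope check of the 1280-wide "scope" screen resolution.
width = 1280
height = width / 2.35   # 2.35:1 scope aspect ratio
print(round(height))    # 545; 544 is the nearest multiple of 16,
                        # a granularity video hardware tends to prefer
```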


----------



## dokworm

Well, I think that is a bit unfair on Pete; I think he was commenting on Bill's response here:


quote:

--------------------------------------------------------------------------------

Hey Bill, how would you say the stock 5700U compares to a Radeon with MP-1 Fitted?

--------------------------------------------------------------------------------


That's a good question, Pete, and I think the answer would depend on what you're looking for in a card.


My 9800 Pro/MP-1 had a pretty good display, but I was constantly bothered by noise and image artifacts that were actually being produced by the ATI. They were most obvious on HD source, not DVD, but once you recognized them you could see them on good DVDs as well.


The 5700U without an MP-1 is sharper, has better detail, truer color and no artifacts that I've noticed yet. I expect it to perform better yet after I install the MP-1 tonight.


How much you actually notice and appreciate these differences will depend a lot on your projector setup and general viewing conditions. They're both very good cards, but with a well tuned projector and critical eye there are noticeable differences.


A lot of us around here have this disease called iwantitifitsbetter-itis and even small improvements are significant. With bigger improvements we can't sleep at night without them. So it depends on how bad you've got IT.

----------------------------------------------------------------------------------------------


I read that as being pretty clear that the stock 5700 was better than the radeon/MP-1 combo.

He also mentioned that it was better, but really only to those with a really good setup and a critical eye.


On his other post, re the MP-1 then added to the 5700U, I would classify "It's not a major jump-out-at-you change, but is significant" as an incremental change.


If it's not 'jump out at you' on a professionally set-up projector with one of the forum's most critical eyes looking at it, then I think it will be a very small increase for most of the people on this forum, so I think Pete's 'small incremental improvement' is probably correct for him and the average user.

So I think his comments are justifiable.

Pete is pretty clear that it is just his opinion, and the comment he made about Bill's wife not seeing the difference makes it pretty clear, I think, that Pete is happy that his eyes are not as discerning as Bill's.


In my own opinion, it also makes sense to buy a 5700U instead of an MP-1 if you currently have a stock Radeon card, going on what we saw in our testing and what Bill saw. I think it looks better than the Radeon with an MP-1 added, so why bother? As we have both said, this still leaves you free to buy an MP-1 as an add-on later when upgradeitis hits, or when the rest of your system has been upgraded to the point where you will see the difference. This also means you are only buying the upgrade once, rather than adding it to your Radeon and then later removing it and re-adding it to your nVidia. I just think it makes good sense to get a 5700 first.


I understand as the manufacturer and seller of the card you have a different opinion on some of these issues, and as you said "Some things are just not for everybody."


I am just trying to help make it a little clearer who it is for. I get the feeling (once again, just my opinion) that the MP-1 is often touted as an absolute wonder by some, and I worry that people who won't really benefit from it will buy it (i.e. someone with an uncalibrated projector and room, or with poor source material/equipment, or someone who has a base Radeon card and would get a better result buying a 5700 than buying an MP-1 to put on their Radeon), and that they would be better off spending their money on something that will get *them* a bigger improvement for their particular setup.


I think you have always been very clear that the MP-1 is not for everyone, and have stressed that the improvements are best seen by a critical eye on a well-set-up system; I have no argument with you there. Others have been a bit more, let's say, 'evangelical', and that is their personal opinion, and that is fine, but I like to try to sort the fact from the hype as much as I can, and make up my own mind after seeing it for myself.


In my opinion, the MP-1 is a really great product for the right customer.

It is the icing on the cake, and would be one of the last things I would add to my system once everything else was just right; at that point I would definitely do it.

For me, right now, I get a bigger improvement by replacing my Radeon with a 5700U, so that is what I have done, and I am very happy with it.


----------



## dokworm

On another note, I fitted the MP-1 to the Momitsu last night, and at first glance this looks like a very worthy combination. I will be reporting the results of that setup in the Momitsu thread after some more testing.


----------



## bblue

Vern,

What really surprises me about your setup is that you now seem to be applying DScaler sharpening at the native picture size of 720x540. I don't understand how that can possibly be as effective as your earlier configuration.


Lately I have been trying VMR instead of overlay, and so far the best combination I've found, if you have enough CPU, is (optionally) d3dnoise set at .5, 1.0, 5.0 HQ as the first filter, a Levels adjustment of the white input to 239 (to get the right scope range on the output), then a resize to at least your display size, with Lanczos luma (only) sharpening at 1.51.


This is with ffdshow input at raw YUV and output at YV12. At 3.2GHz I don't have enough horsepower to output at RGB32. ZP Pro is set for VMR7 windowed, as VMR9 windowed has some odd little jitter on up and down pans.


If I drop Lanczos luma sharpening altogether, turn off denoise, and add DScaler (keeping resize, but no sharpening), it doesn't look quite as detailed to me here. I would think that if you were applying DScaler on the native video it couldn't help but do less sharpening.


Is there anything else you are leaving out of your description? Which decoder are you using with this combination?


--Bill


----------



## Mark_A_W

According to Blight, VMR7 may be actually using the overlay.


----------



## bblue

That's interesting. It doesn't look like overlay as far as the need to change basic white and black levels to get the output working right, but it doesn't seem quite as sharp (slightly) as VMR9. Unfortunately, VMR9 does odd things on my computer: subtle vertical shaking on pans, kind of a top-to-bottom walking look on certain kinds of moves/updates. Nothing I've tried seems to affect it; it's just there.


Overlay sure is easier to work with! The grey scale doesn't seem to work out quite right with it, though. I haven't yet been able to put my finger on exactly what that is.


--Bill


----------



## Bill Gaw2

I have a GeForce 5700 and tried to load 1440x960 at 60Hz, and the image became extremely unstable. It works fine at 72Hz. Has anybody else seen this problem?


Bill


----------



## JeffY

No but I had the problem at 48Hz.


----------



## Tedd

John the Depot Dude and I had similar experiences with his blended setup of dual 1292Qs and an FX 3000. Some resolutions wouldn't take and produced error messages. Even Powerstrip occasionally produced an error message. The newer Powerstrip 3.50 seemed to solve the issues with 1440x1080P, with everything fine the next day. We tried a later nVidia driver and had problems, so we went back to the 57.xx that had been acting strange, and this time had zero problems setting up 1440x1080P. I have never seen anything like it and have to admit, it left me wondering if the drivers were corrupt initially.


----------



## Tedd

Skimmed through this thread....


Today we did some A/B testing of an FX3000 against a Radeon 9000 with MP-1 and 5BNC on blended dual Sony 1292Qs running at 1440x1080P per projector (4:3, so all the CRT raster is being used). (A VERY cool project!) Nothing too elaborate, scientific or even involved, as the PCs are testbeds to check out blending of two CRT projectors. The interesting part of all this is that we could see, side by side, the differences between the cards as the DVD's image was spread across the two projectors on a 54x137" 2.35:1 Goo Screen. The Radeon with MP-1 had less noise, but the FX card was outperforming the 9000. It wasn't a runaway winner, though. I'm left thinking Mike Parker and nVidia have done some good work! My MP-1 with 5BNC will be transferred to my FX1000, and I think that will be something special.


The video source is an SDI'ed dual-output RP82 feeding the two HTPCs with PMS Video SDI capture cards.


The FX3000 running dualhead and blended horizontally (at 26xx x 1080P) had less than desirable picture quality. The dual Radeon 9000s with MP-1s walked all over it.


----------



## mp20748

Quote:

_Originally posted by Tedd_
*


I'm left thinking Mike Parker and nVidia have done some good work! My MP-1 with 5BNC will be transferred to my FX1000. And I think that will be something special.

*
Absolutely ... I can't wait to get back to testing the 5950. That card is one serious video card, and I'm hoping to show just how great it is when married to a MP-1 at the next HT gathering. In fact, I think so much of the 5700/5950 that I've redesigned the MP-1 to be a perfect technical mate for these cards.


I've been very stretched, but it has not all been stressful chores. Today I had a unique opportunity to meet with a former Electrohome/Christie Digital engineer. Yep, I left the airport with a grin on my face. After all of my headaches the past year or so, I think this is going to be a very good year.


----------



## hdtv_lover

Wonder how the new 6000 series will perform in comparison. nVidia's website says it has 225 million transistors, about 8 times as many as PC processors (minus cache), and includes a "video processor." Very interesting info.


But how will it perform in the real world?


----------



## RoBro

Quote:

_Originally posted by Tedd_
*

The FX3000 running dualhead and blended horizontally (at 26xx X 1080P) had less then desirable picture quality.The dual Radeon 9000s with MP-1s walked all over it.*
Tedd,

how can the behaviour of the 3000 be that different displaying dualhead and blended versus single-head with DScaler?

Roland


----------



## MC Maniac

Quote:

_Originally posted by RoBro_
*Tedd,

how can the behaviour of the 3000 be that different displaying dualhead and blended versus single head with dscaler?

Roland*
I suspect the technical reason is that we were blowing 480i up to a resolution of 2880x1080 with the 3000 and spreading this across 2 CRTs, versus 2 PCs where each does 1440x1080.


The 3000 looked very good using two computer monitors, but it was different when 2 CRTs were used.


The picture, blown up, looked washed out -- it had no punch to it. The video itself was not smooth either. It was disappointing and not something I could live with.


Too bad, because it means having to go from one PC to a pair of PCs. And I now want to get a pair of 5950s, each with the MP-1.


----------



## RoBro

Would there be a chance not to use the video card's scaler but to scale with FFdShow?

Roland


----------



## Chuchuf

I have now been running the 5700 for about two months and have to say that I am still impressed with it as a card, for the reasons I stated earlier in this thread.

Over the weekend I had an opportunity to look at a PNY 5700 with the MP-1 mods that Mike did for Andrew, and you could see the improvement in the desktop immediately. Scoped it, and you could definitely see the difference the MP-1 makes. Can't wait to get one.

We also scoped the VMR9 and overlay settings using the latest nVidia video decoders (unobtainium right now, but soon to be released), but I really haven't had a chance to look at them yet watching DVDs. Initial impressions were good. These are the same video filters which will be used in the new TT2 coming soon.


Terry


----------



## Tedd

The interesting part of seeing the blended 54x137" setup is that all the warts are there to see, in real time, and we could see each half of the DVD's image, since two computers are being used, each running 1440x1080P. The final image is somewhere around 2650x1080P, scaled up from the DVD resolution. The one issue I had with the Radeon 9000 was never evident to me on a 54x96" screen.


----------



## JeffY

Quote:

_Originally posted by Chuchuf_
*We also scoped the VMR9 and overlay settings*
Scopes are no good for calibrating graphics cards. All the settings are digital. You have to use test patterns.


----------



## mp20748

Quote:

_Originally posted by JeffY_
*Scopes are no good for calibrating graphics cards. All the settings are digital. You have to use test patterns.*
Actually, you'll have to use both. Yes, the settings are done in the digital domain, but the end result is analog. And with high-performance analog video, the video reference (source) must be set to the video standards before any adjustments on the display device can be accurate.


So a scope is needed to display the proper IRE levels for both contrast (white level) and brightness (black level) in the video card itself, and proper test patterns must be used for reference material.


The default settings in the cards are not necessarily preset reference levels, and that is why we scope the output of the cards for accurate video levels.


----------



## JeffY

With the overlay, YCrCb is mapped to 0-255 RGB; there is no room to adjust contrast upwards, which is why Cliff's scoped settings clipped whites. Even with contrast set to 100, you lose some detail during the component-to-RGB conversion. With VMR, YCrCb is mapped to 16-235 RGB, and trying to use the video card settings to get VMR to display black at 0mV and white at 700mV is totally pointless; you're in a losing battle from the beginning.
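To put rough numbers on the two mappings (my own simplification; real decoders differ in rounding and chroma handling):

```python
# Overlay-style "full range" expansion: video Y 16..235 -> RGB 0..255.
# VMR-style "studio RGB": video Y 16..235 stays at RGB 16..235.

def expand_full(y):
    """Map a video level to full-range RGB, clamping at the 8-bit limits."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

def keep_studio(y):
    """Pass video levels through unchanged (studio RGB), clamped to 8 bits."""
    return max(0, min(255, y))

# Full expansion already puts reference white at 255, so extra "contrast"
# can only clip; studio RGB keeps headroom for whiter-than-white (Y > 235).
print(expand_full(235), expand_full(240))   # 255 255 -- WTW is gone
print(keep_studio(235), keep_studio(240))   # 235 240 -- WTW survives
```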


There is an interesting thread here that you might want to look at.

http://www.avsforum.com/avs-vb/showt...hreadid=416292


----------



## bblue

Jeff,

There are lots of ways to adjust the video, and many of them are wrong. It sounds like you need to sit down with a scope and see exactly what can be done when levels and expansions are correctly set. You are correct that with certain combinations of hardware, correct output is not possible with just adjustments in the player or video card.


But it can be done correctly with just about any hardware with a filter plug-in called ffdshow and a software DVD player that supports it, such as Zoom Player Pro. In ffdshow you can define your input colorspace, your output colorspace, and the exact expansion, and fine-tune white and black levels -- even gamma -- without any crushing, and with the correct output levels. It can be used with overlay or VMR and any video card.


You cannot calibrate your video source correctly without a scope! You can only depend on other's settings (and/or manufacturer opinions) which may or may not be right. Your projector or monitor should also be set up correctly, which is crucial for optimum video performance.


--Bill


----------



## JeffY

If the graphics card's peak white output is below 700mV, there is no way you can increase the output without clipping whites, and the whole idea of Studio RGB (used with VMR) is that you keep it exactly the way it is and use the display to recalibrate. So black will be around 50mV and white will be around 630mV. Doing lots of manipulation in FFDShow is just going to make things worse, since colour space conversions, especially in software, are very lossy.
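Those ~50mV/~630mV figures come out close on a simple model where the card's DAC maps digital 0-255 onto a 0-700mV swing (a simplification that ignores setup/pedestal and the card's actual gain):

```python
# Digital RGB code -> approximate analog output, assuming a DAC whose
# full 0..255 digital range spans 0..700 mV.

def code_to_mv(code, full_scale_mv=700.0):
    return code / 255 * full_scale_mv

print(round(code_to_mv(16)))    # 44  mV -- studio-RGB black, near the ~50mV quoted
print(round(code_to_mv(235)))   # 645 mV -- studio-RGB white, near the ~630mV quoted
print(round(code_to_mv(255)))   # 700 mV -- full-range white hits reference level
```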


----------



## Chuchuf

Jeffy,


I'm not really sure what you mean by the statement that scopes aren't any good for calibrating these cards.

The output when using RGBHV is analog, so of course they are. Setting up brightness, contrast and gamma can be done very accurately when the output is scoped against a test pattern that shows the correct values. This allows you to adjust the overlay controls in the program very accurately.

In the setup here, when we scope the values we are also looking at the changes on a monitor at the same time to see the effect happen. Once the card is completed, I then send the signal into a reference-calibrated Sony G90 to see what the final results look like.


Terry


----------



## bblue

Jeffy wrote:
Quote:

If the graphics cards peak white output is below 700mV, ...
then it's time to get a different card. There are lots of them that produce the correct outputs.


In a studio environment (especially), but also in a multi-source home theatre, using the monitor or projector to compensate for a poor source is not the way to go. You can do it that way on some projectors with many memories, but it is still technically wrong. If you use the projector's white level (for example) to make up for lost output from the video card, you are also amplifying noise on the cables and, depending on the projector, increasing some of its own input noise.


Studio engineers (audio or video) would likely get fired for doing such a thing. You would correct it at the source, or in the video chain (which usually doesn't exist in a home environment), based on scoped readings -- not at the monitor.


--Bill


----------



## JeffY

Quote:

_Originally posted by bblue_
*Jeffy wrote:then it's time to get a different card. There's lots of them that produce the correct outputs.


In a studio environment (especially) but in a multi-source home-theatre, using the monitor or projector to compensate for poor source is not the way to go. You can do it that way on some projectors with many memories but it still is technically wrong. If you use the projector white level (for example) to make up lost output from the video card you are also amplifying noise on the cables, and depending on the projector, increasing some of its own input noise.


Studio engineers (audio or video) would likely get fired for doing such a thing. You would correct it at the source, or in the video chain (which usually doesn't exist in a home environment) based on scoped readings, not at the monitor.


--Bill*
This would rule out all Radeons then; you have to increase the contrast to 105 to get reference voltage output. Already at the default contrast of 100, you are clipping detail because of the YCrCb-to-RGB conversion.


If you read the thread I linked to, Microsoft, Stacey Spears and Guy Kuo are advocating converting full-range YCrCb to RGB to keep WTW and BTB and allow much more room for the colour space conversion. The downside of this is non-standard voltage outputs. I'm personally not happy with this; I advocate being close to reference while also not losing detail in the digital domain.


----------



## stylinlp

Ok I'm a little confused. I didn't want to comment on this subject because I wanted to see what you all decided on this topic.


FYI, I have an AA in Video Production. I worked for a local NBC station and a few cable shows around 1990, so I know the basics of video engineering. Hey, I was one of the ones using an Amiga for my graphics!


I know that you're supposed to adjust video levels at the source and not on the monitor (projector). I used to take pride in my perfect video levels when editing a TV show. So the source here would be the video card, and also TheaterTek, I suppose. In the TT forums they universally agree on changing Brightness to 98, so I've done that. Then I set up my Marquee 8500 levels the standard way with the internally generated greyscale. From what I know, we shouldn't do it this way; we should adjust the greyscale with the video card. All by sight, because I don't have a scope.

If I were to do that, should I put the projector brightness and contrast at 50/50 while adjusting the video card, and then maybe fine-tune again using the projector's brightness and contrast settings?


----------



## bblue

Quote:

This would rule out all Radeons then, you have to increase the contrast to 105 to get reference voltage output. Already at the default contrast of 100 you are clipping detail because of the YCrCb conversion to RGB.
The fact that you *can* increase the contrast and achieve a higher output from the card means that the card is not the limiting factor. The clipping/crushing you're describing is happening much earlier than the card, and it can be dealt with properly in ffdshow.
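For what it's worth, the kind of levels remap I keep referring to amounts to something like this (the 16/239 endpoints and the gamma handling here are illustrative of my earlier settings, not ffdshow's exact internals):

```python
# Generic levels remap: stretch [in_black, in_white] onto [out_black, out_white],
# with an optional gamma tweak, clamping anything outside the input range.

def levels(pixel, in_black=16, in_white=239,
           out_black=0, out_white=255, gamma=1.0):
    x = (pixel - in_black) / (in_white - in_black)
    x = max(0.0, min(1.0, x)) ** (1.0 / gamma)
    return round(out_black + x * (out_white - out_black))

print(levels(16))    # 0   -- input black lands at output black
print(levels(239))   # 255 -- the chosen white input lands at full scale
```

Doing this upstream means the card's own contrast control can stay at its default, where it isn't clipping anything.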


I haven't had an ATI in my machine for a while now, but I was using the same filtering process with ffdshow when I did, and I had no problems with it. I suppose they could have broken something in the drivers; it wouldn't be the first time.


I did read a great deal of the thread, but not all of it. I saw no references to ffdshow, just a whole lot of arguing back and forth with no apparent real solution. Do you know if anyone has actually tried ffdshow for this issue?


--Bill


----------



## techman707

Quote:

_Originally posted by JeffY_
*If the graphics cards peak white output is below 700mV, there is no way you can increase the output without clipping whites and the whole idea of Studio RGB (used with VMR) is that you keep it exactly the way it is and use the display to recalibrate. So black will be around 50mV and white will be around 630mV. Doing lots of manipulation in FFDShow is just going to make things worse since colour space conversions especially when in software are very lossy.*
The display should NEVER be calibrated to the source because, apart from proving nothing, that would make it the ONLY monitor that is correct.


If you were to set up 50mV from blanking and 630mV above that, you would be over the peak white of .7Vpp and would probably clip the whites. One of the problems today is all the mixed signals, some using 0 IRE from blanking and some set at 7.5 IRE from blanking. If you don't know what the source is, your output will ALWAYS be wrong.


For the purposes of this forum, the card's output should be properly set with the overlay controls for signal level, pedestal and gamma. If all signals (especially DVDs) were correct, you could set it once and be done, but in the real world we're stuck with the junk that comes off DVD and (some) broadcast signals.


----------



## bblue

Quote:

If I were to do that then I should put the projector brightness and contrast at 50/50 while adjusting the video card? Then maybe fine tune it again using the Projector Birghtness and contrast settings?
Well, you still have a rubber yardstick. Was the projector accurately calibrated for proper white/black levels with contrast and brightness at 50? If not, it's anyone's guess. If so, maybe, depending on how acute your adjustment skills are on the video card. But as you probably know from your video days, it's so easy to be off by quite a bit and not realize it, even when things appear to be calibrated correctly. If the monitor/projector was right on and you were used to how it 'should' look, you might be able to pull it off.


If you don't have the means to calibrate, then setting information from others and visuals is your only option, unfortunately.


--Bill


----------



## JeffY

Quote:

_Originally posted by techman707_
*The display should NEVER be calibrated to the source, because apart from proving nothing, that would be the ONLY monitor that is correct.*


Tell that to Stacey Spears, Guy Kuo and Microsoft because they disagree.


Quote:

If you were to set up 50mV from blanking and 630mV above that, you would be over the peak white of 0.7Vpp and would probably clip the whites. One of the problems today is all the mixed signals, some using 0 IRE from blanking and some set at 7.5 IRE from blanking. If you don't know what the source is, your output will ALWAYS be wrong.
The thing that clips whites on the video card is going beyond 255,255,255 RGB; nothing exists beyond this, it can't! The clipping is digital, NOT analogue.


Quote:

For the purposes in this forum, the card's output should be properly set with the overlay controls for signal level, pedestal and gamma. If all signals (especially DVD's) were correct, you could set it once and be done, but in the real world, we're stuck with the junk that comes off DVD and (some) broadcast signals.
As long as you don't increase contrast and saturation above 100, because that is the maximum before clipping.


----------



## JeffY

Quote:

_Originally posted by bblue_
*The fact that you *can* increase the contrast and achieve a higher output from the card means that the card is not the limiting factor.*
Actually it does roll off.

http://www.nabs.net/cwatson53/Radeon.JPG


----------



## bblue

What is the scope shot of exactly?


Those cards have some HF rolloff because of the output filtering after the DAC. If that was removed and something like an MP-1 used as the output buffer instead, it would stay pretty flat.


Most cards by default have some amount of in-band rolloff because of filtering.


--Bill


----------



## JeffY

It's not a DAC or filter problem! You can't go above 255,255,255 in RGB!


----------



## techman707

Quote:

_Originally posted by JeffY_
*Tell that to Stacey Spears, Guy Kuo and Microsoft because they disagree.




The thing that clips whites on the video card is going beyond 255,255,255 RGB, nothing exists beyond this, it can't! The clipping is digital NOT analogue.




As long as you don't increase contrast and saturation above 100 because that is the maximum before clipping.*
When MICROSOFT refers to setting the monitor, they assume you are a SINGLE computer user and the signal is going NOWHERE. As to anyone else, THEY'RE DEAD WRONG!!!


As for the levels, YOU stated the levels, I just told you what would happen if your hypothetical (50mv/630mv) was true.


I have no idea what you mean by the last sentence. A properly set up system SHOULD start to clip at 101 IRE units. Of course, when you use a waveform monitor, you need to set it to roll off the chroma information; otherwise, you will really be off.


----------



## bblue

Quote:

It's not a DAC or filter problem! you can't go above 255,255,255 in RGB!
Of course not, in any format. But at that level to the card, it will put out very close to .7v p-p, usually from .68 to .72, sometimes even higher depending on the hardware config.


The issue is getting the digital levels correct without crushing before they get to the card. That's what ffdshow does.


--Bill


----------



## mp20748

Quote:

_Originally posted by techman707_
*


For the purposes in this forum, the card's output should be properly set with the overlay controls for signal level, pedestal and gamma.

*
Bingo!


I have no idea what is happening in the digital domain on a video card. My ONLY concern is being able to adjust the analog output levels ("signal level, pedestal and gamma") as close as possible to reference. If that cannot be accomplished, the card is broke, period.


So far this has not been a problem. And for the slight differences in the various cards, external means have worked well.


Ok, now we know that the analog signal from the video cards is not PERFECT, and since we can't remanufacture the cards to our personal likings, we should email our complaints to the card manufacturers.... Then, maybe we'll get this thread back on track.


----------



## techman707

Quote:

_Originally posted by mp20748_
*Bingo!


I have no idea what is happening in the digital domain on a video card. My ONLY concern is being able to adjust the analog output levels ("signal level, pedestal and gamma") as close as possible to reference. If that cannot be accomplished, the card is broke, period.


So far this has not been a problem. And for the slight differences in the various cards, external means have worked well.


Ok, now we know that the analog signal from the video cards is not PERFECT, and since we can't remanufacture the cards to our personal likings, we should email our complaints to the card manufacturers.... Then, maybe we'll get this thread back on track.*
WELL SAID !!!


----------



## Chuchuf

I scoped the 5700 today in Overlay and VMR9 and it looked pretty darn good.


Terry


----------



## techman707

Quote:

_Originally posted by Chuchuf_
*I scoped the 5700 today in Overlay and VMR9 and it looked pretty darn good.


Terry*
What was the pedestal (setup) level at? Was it at 0 IRE or 7.5 IRE, and were there ANY adjustments done on the overlay settings?


Assuming all the settings from the card are all right and the projector is calibrated "normally" (factory normal), how did it look when feeding it DVD's? Did it need any gamma adjustment?


----------



## JeffY

If you are using the video overlay, the YCrCb conversion is done as follows: Y=16 = RGB 0,0,0, and the analogue output should measure at or close to 0mV. All data below video black (Y=16) is discarded. For white, Y=235 = RGB 255,255,255, and the analogue output should be at or close to 700mV. All data above white (Y=235) is discarded. If you increase contrast above 100 in the overlay settings, you will discard data at both ends, so that effectively black is greater than Y=16 and white is less than Y=235; you will get the same effect with chroma if you increase saturation above 100. If you decrease brightness below default (0, or 7.5 for TT) you will crush blacks, and if you raise brightness without decreasing contrast you will clip whites. As you can see, there is very little scope (excuse the pun) for changes.
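The overlay mapping described above can be sketched in a few lines (a hypothetical illustration: the function name is mine, only the grey axis is shown, and real drivers also convert the chroma channels):

```python
def overlay_luma_to_rgb(y):
    """Map video luma (16-235) onto RGB 0-255, discarding out-of-range data."""
    y = max(16, min(235, y))             # data below Y=16 or above Y=235 is discarded
    value = round((y - 16) * 255 / 219)  # 219 video steps stretch across 256 RGB codes
    return (value, value, value)

print(overlay_luma_to_rgb(16))   # video black  -> (0, 0, 0)
print(overlay_luma_to_rgb(235))  # video white  -> (255, 255, 255)
print(overlay_luma_to_rgb(10))   # blacker-than-black is lost -> (0, 0, 0)
```

Note how anything below Y=16 or above Y=235 simply collapses onto the endpoints, which is the "discarded" behaviour described above.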


If you are using VMR then the YCrCb conversion is done with Y=0 = RGB 0,0,0 for black and Y=255 = RGB 255,255,255 for white; this allows more space for colour space conversion and maintains all the below-black and above-white video information. Video black (Y=16) now becomes RGB 16,16,16 and video white becomes RGB 235,235,235. From an analogue output voltage point of view, video black (Y=16) will be around 50mV and video white (Y=235) will be around 630mV. Stacey Spears, Guy Kuo and Microsoft argue that these voltages are fine and that you merely have to reduce brightness and increase contrast on the display to compensate. I happen to think this is rubbish and I'm really surprised that Stacey and Guy think this is a good idea. You may be able to use settings in FFDshow to alter this behaviour, although I think you would be better off if you had started with the same colour space translation that is done in the overlay, bearing in mind that every time you do a colour space translation and change video settings you are losing detail and increasing the number of rounding errors.


In a perfect setup (one that doesn't exist), the colour space conversion would be done the VMR way (Studio RGB), but RGB 0,0,0 would have a negative voltage value, RGB 16,16,16 = 0mV, RGB 235,235,235 = 700mV, and RGB 255,255,255 greater than 700mV.
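The "around 50mV" / "around 630mV" figures above follow from simple linear-DAC arithmetic (a back-of-the-envelope sketch; the exact 0-700mV swing is an assumption, real cards vary, so the computed values land a little below the round numbers quoted):

```python
FULL_SCALE_MV = 700.0  # assumed full-scale swing at RGB 255

def output_mv(code):
    """Analogue output for an 8-bit RGB code on an assumed linear 0-700mV DAC."""
    return code / 255.0 * FULL_SCALE_MV

print(round(output_mv(16)))   # Studio RGB video black: ~44mV
print(round(output_mv(235)))  # Studio RGB video white: ~645mV
```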


----------



## Chuchuf

Jeffy,

There is nothing that I know of you can do to these video cards in their stock configuration to get them to go to negative voltage or above their highest output. You will simply hit the rails and that is it. There isn't anything else.

If you scope so that 0 IRE is above 0V, then you will not have the black levels and detail that you are looking for; it will not achieve black. So (to answer Bruce) I set 0 IRE at 0V, set 100 IRE at max V, whatever that is, and then adjust the height of the steps in a step grey-scale pattern so that they are equal (or as equal as I can make them).

Bruce, I haven't had a chance to look at this HTPC on my PJ yet, but I expect it to look pretty normal. I have a program called DisplayMate Pro that has some very clean patterns and allows you to scope the desktop using a step grey-scale pattern. This, when calibrated, can act as a pretty good "reference" for the projectors, allowing you to generate the Colorfacts patterns for grey-scale calibration of the PJ as well as set up the brightness and contrast of the PJ. Once this is done and you have scoped the overlays of the DVD player, it's usually darn close. But I will have a careful look at it in a day or so.

Jeffy, so what you are saying is that, calibrating the way I do with Overlay, I am missing the blacker-than-black and whiter-than-white info? Because I have essentially hit the rails and clipped them.

And with VMR9 I am right on? Or is it vice versa?

After thinking about it a bit more, I see what you are saying Jeffy. I don't think there is any choice but to do what Guy and Stacey are saying if the conversion is in fact what you describe above.


Terry


----------



## bblue

JeffY,

You're right -- it's rubbish. There's no logic (IMO) in maintaining the 16-235 mapping inside RGB colorspace. That's throwing away a lot of resolution steps in the DAC.


But again, you have to see all this on a scope for it to make sense. Whatever black is defined to be in a calibration, blacker than black can't exist unless that standard is slightly above 0 IRE, which it frequently is. Likewise for white at 100 IRE.


Studios and production houses don't use the same calibration from DVD to DVD, either. There's a true zero level baseline as well as a true 255 level baseline that you just can't exceed. If you want the capability of blacker than black your 0 IRE video reference has to shift upward slightly. Same thing with whiter than white, your 100 IRE video reference has to shift downward a bit to allow for it. This is quite apparent when watching a scope during playback. Whatever shift there might be, it is nothing as high as 16 on the black end or as low as 235 on the white end.


I've been watching HD and DVD content on the scope lately, and it's quite an eye opener. You clearly see the 0 baseline from the DAC, but the black level from the DVD will vary anywhere from 2-10 IRE above that, usually more like 2-5. Whites are usually slammed at the top, .7v 100 IRE. Each production scene will generally have level manipulation going on and will not necessarily use the same black or white reference level as the scene before or after it (that's for effect, usually), and certain scenes will have some blacker-than-black (blacker than their defined video black standard, but still above 0 IRE).


With software like ffdshow it is incredibly easy to expand your component space to RGB and at the same time adjust exactly how you want to align their video black and white to be, and how much headroom you want on each end. You can see instantly where they are. If your projector is properly calibrated you'll see the exact relationship that the scope shows.
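The kind of levels adjustment described above is, at heart, a linear remap. This sketch shows the idea (illustrative only; ffdshow's actual levels filter exposes its own parameter names and works on more than the grey axis):

```python
def expand_levels(value, in_black=16, in_white=235, out_black=0, out_white=255):
    """Linearly remap video levels toward full range, clamping to 8 bits."""
    scaled = (value - in_black) * (out_white - out_black) / (in_white - in_black) + out_black
    return max(0, min(255, round(scaled)))

# Full expansion: video 16-235 fills the whole DAC range...
print(expand_levels(16), expand_levels(235))           # 0 255
# ...or leave a little headroom at each end, so blacker-than-black content
# (e.g. Y=10) still lands above DAC zero instead of being crushed:
print(expand_levels(10, out_black=12, out_white=243))  # 6
```

The `out_black`/`out_white` headroom values are arbitrary examples of the "how much headroom you want on each end" choice mentioned above.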


Another interesting side effect of this type of calibration and accuracy is that you seldom need gamma adjustments!


Watching HD on a scope is interesting too -- you find that it's a different animal than what you see on DVD. While there is an absolute 0 black reference which you cannot go below, various stations/channels have vastly different approaches in level management from source to source. Some will have a fixed 7.5IRE black level, others will have different standards depending on the source, from 0 to 5 or so. Whites will mostly cap at 100 IRE .7v, but certain graphics, live hi-res outdoor scenes will extend past .8v (soft peaks) from 0 as their blackest black. It's quite something to scope out. Movies will sometimes be from 0 to 100, 0 to 90 (or so), 5 to 100, 7.5 to 100, etc.


Terry,

Be careful on Colorfacts when expecting the Desktop and Overlay levels to be very close. Sometimes they are and other times not. On my NVidia 5700 you can adjust levels of Desktop independently from overlay/VMR, and I found the desktop adjustment needed to be 91% of the default to be equal to overlay at '100 IRE'. That's a really significant difference if you're trying to set white to a particular screen light level.


--Bill


----------



## techman707

Bill,


A lot of the confusion with the variety of different signal levels goes back to the early 1980's. The EIA standard called for a 7.5 IRE setup level, while the EIAJ (Japanese Electronic Industries Association) decided to use 0 IRE (with reference to blanking). When digital video came on the scene, the Japanese led the way in its development and 0 IRE became the de facto standard for digital video.

Now along came all the telecine transfers for all the movies that began to show up on VHS tape. If the transfers were done on regular telecine equipment (projector and camera), a 7.5 IRE setup was always used; however, when some of the transfer houses started to install the newer Rank Cintel HD digital flying spot scanners to transfer film, most used a 0 IRE setup following the Japanese standard. That's one of the reasons you have all these "different" signal type DVD's floating around. Some (cheap) companies used their old (and poor) analog masters to make their DVD masters from, while others had new HD masters made for their DVD's, figuring that they would already have the HD master when (or if) they decided to release the picture in HD. The new transfers made with the Rank Cintel units have the highest contrast ratio and the best gamma characteristics.

When all these varied signals are broadcast for regular TV, no matter what kind of signal comes through to the transmitter, it will impose a 7.5 IRE setup (otherwise it could damage the transmitter tubes if the signal dips below blanking). So if ALL these signals are not properly processed BEFORE they are sent to the transmitter (which many times they're not), a poor signal will be broadcast.


ANY signal can be processed to bring it into conformity so when it's viewed on a properly calibrated monitor NO further adjustment of the monitor will be necessary. In the case of DVD's, that should be done as far back in the chain as possible, which in the case of HTPC, would be the overlay adjustments (or ffdshow if you run it).


Bruce


----------



## bblue

Quote:

So if ALL these signals are not properly processed BEFORE they are sent to the transmitter (which many times they're not) a poor signal will be broadcast.
...except for HD broadcasts, in which case they are all over the place like I described.


That's an interesting history for how it came about. I'm sure there's even other variables over the years than just those.


--Bill


----------



## techman707

Quote:

_Originally posted by bblue_
*...except for HD broadcasts, in which case they are all over the place like I described.


That's an interesting history for how it came about. I'm sure there's even other variables over the years than just those.


--Bill*
Yeah, there were MANY MORE variables, but none of them caused quite so much trouble as when they created two pedestal standards.


HD broadcasts are a different animal, but most broadcasters STILL have to feed "old" analog signals and at this point in time "some" aren't too particular about correcting them BEFORE they are converted. However, in time, when all the signals are natively digital, everything should really improve.


Here in NY, the FOX affiliate has a BETTER HD signal than ABC, go figure.


----------



## mp20748

Quote:

_Originally posted by techman707_
*


Yeah, there were MANY MORE variables, but none of them caused quite so much trouble as when they created two pedestal standards.

*
Yep, and that is the main reason why I ignore the theory on video cards. The 0 to 255 digital rule only makes sense, IF the pedestal is constant and relevant to that theory, but in reality- it ain't!


----------



## techman707

Quote:

_Originally posted by mp20748_
*Yep, and that is the main reason why I ignore the theory on video cards. The 0 to 255 digital rule only makes sense, IF the pedestal is constant and relevant to that theory, but in reality- it ain't!*
RIGHT! While you could see people screwing up the settings on their projectors, there's NO reason why the people making DVD masters can't GET IT RIGHT. But it's something that we'll have to live with (for now) until they FINALLY wake-up.


----------



## Chuchuf

Bill,

But that is the point. You can set up the desktop to exact values from 0 to 100 IRE, giving you a very good reference to calibrate your PJ to, getting its contrast and brightness in the ballpark. If you then have a good reference disk, you can set up the overlays in your DVD software by scoping them and then use Colorfacts to set your grey scale.

Bruce, you asked if some gamma was needed and the answer is yes. We are trying to calibrate the default settings that are being used by NVidia in their video properties to get them correct.

I am finding with this technique that when I review on my G90 with reference material I know intimately, the settings I scoped are correct.


Terry


----------



## techman707

Terry,


It seems that it's "mostly" DVD's that always appear to have bad gamma, however, the majority of HDTV signals appear good. Of course maybe the reason is that I'm not seeing lousy film transfers on HDTV.


I think the answer for the gamma problem is to use a program like TheaterTek that stores the setting from each DVD. For most people though, that can be a real PITA.


Bruce


----------



## JeffY

Terry, humor me, what overlay settings did you come up with?


----------



## pcgeek

Since most of this is WAY over my head I've been trying to stay out but I want to make sure one point isn't missed. I have a BIG problem using something like ffdshow to compensate/color correct because it is still working in 8-bit digital space. Any expansion/compression/gamma will result in losing the continuity of the signal.


Overlay controls or ICM profiles for VMR I'm better with, because I believe they work in the card's hardware directly and are probably 10-bit or higher (I could be wrong about this).


I'm not sure what level of control the projector has but I believe it's more continuous/analog than working in 8-bit digital space.


For my system I use a single source so I have no problem adjusting the projector to get as close as possible and then using a color profile on the PC to tweak the curves.


-Pat


----------



## Chuchuf

Jeffy,

Since there isn't a sliding scale with numbers on it in the video decoders I'm using, I can't give you the exact numbers for the overlay in TT. I can just tell you they were a bit different from the default ones that were in there when we got the filters. We are doing them for Overlay, VMR7 and VMR9. I'm not sure you could relate to the settings anyway, as these decoders and the drivers aren't released yet, so you really wouldn't have a reference to put them up against if I did pull the settings out of the registry for you.


Bruce, Video based HDTV has good gamma, HBOHD film based is pretty bad sometimes.


Terry


----------



## techman707

Terry,


I was only referring to OTA broadcasts. I've never seen HBO HD, or any cable or satellite HD for that matter. I used to have satellite (which overall was pretty good). but don't any longer. I recently bought a Samsung SIR-T360 combo satellite/OTA box, but I only use it for OTA broadcasts. It works great and also has video inputs so you can connect a DVD player and scale it up to 1080I.


Bruce


----------



## JeffY

The defaults in the old TT are completely wrong; it may be that they have been transferred to the new version.


----------



## techman707

Quote:

_Originally posted by JeffY_
*The defaults in the old TT are completely wrong; it may be that they have been transferred to the new version.*
How could the "defaults" possibly be right? With all the different levels on cards, etc., they would have to be adjusted for YOUR setup.


----------



## JeffY

I never said what is right, I just said the TT defaults are wrong. 


Any settings that clip whites or crush blacks are wrong; the default TT settings clip whites by extending video whites beyond the RGB limit.


----------



## mp20748

Quote:

_Originally posted by JeffY_
*I never said what is right, I just said the TT defaults are wrong. 


Any settings that clip whites or crush blacks are wrong; the default TT settings clip whites by extending video whites beyond the RGB limit.*
What happens to the IRE window, if/when the pedestal rises above the preset level?


And why would the pedestal move from its preset level?


----------



## bblue

JeffY,

TT supports ffdshow, so correcting it can't be that difficult.


+Chuchuf,

At least with the NVidia 5700 and the drivers I'm using (latest as of 2 wks ago) I found NVidia's overlay settings to be fine. Just lower the desktop output to 91%. For HTPC use, using ffdshow to adjust the transition from DVD to overlay works very well. I doubt seriously you could accomplish this correctly with just driver overlay adjustments, TT adjustments, and no ffdshow.


--Bill


----------



## Chuchuf

Jeffy,

No, the defaults from the old TT aren't being transferred to the version I am looking at. The overlay adjustments we are doing are to the NVidia decoder, and that doesn't have a numbered sliding scale to reference. Kind of a pain, in fact, to see where you are when you make adjustments.


Bruce, as it works out I don't have OTA here just cable so my 4 HD OTA channels come in via Comcast.


Terry


----------



## techman707

Terry,


I wonder what happens to the HD cable signal AFTER it passes through Comcast's head end.


Bruce


Jeff,


What's the difference WHAT the default settings are, since they have to be adjusted for your system anyway?


----------



## JeffY

Quote:

_Originally posted by mp20748_
*What happens to the IRE window, if/when the pedestal rises above the preset level?


And why would the pedestal move from its preset level?*
There isn't a pedestal as such; all controls are purely digital.


There is no digital processing room on either side of black and white: if you raise brightness, white detail disappears; if you decrease brightness, black detail disappears; if you raise contrast, they both disappear. The overlay is so simple, there is nothing to calibrate. Video black is the lowest black possible and video white is the whitest white, and from an analogue point of view black is 0mV and white is 700mV (or damn near it). The default settings are in essence perfect and can only be made worse by changing them.


With TT the main problem is that the default contrast (105) is too high and everything above 97-98 IRE is clipped.
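This clipping arithmetic can be modelled directly (a toy model only; the pivot point and rounding are my assumptions, with contrast scaling around mid-grey, not the driver's actual code):

```python
def apply_contrast(value, contrast_pct):
    """Scale an 8-bit code around mid-grey and clamp; over 100% clips both ends."""
    scaled = (value - 128) * contrast_pct / 100 + 128
    return max(0, min(255, round(scaled)))

print(apply_contrast(255, 100))  # 255: at 100% nothing is lost
# At 105% contrast, codes from ~249 up all collapse onto 255, i.e.
# roughly the top 2-3 IRE of the picture is clipped:
print(apply_contrast(249, 105))  # 255
print(apply_contrast(244, 105))  # 250: just below the clip point still survives
```

Under this assumed model, a 105 contrast setting clips everything above roughly 97-98 IRE, which lines up with the figure quoted for TT.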


----------



## JeffY

Quote:

_Originally posted by bblue_
*JeffY,

TT supports ffdshow, so correcting it can't be that difficult.
*
TT 1.5 only supports ffdshow using the Sonic software-only mpeg decoder, which is generally poor quality and has bad chroma-bug and I-frame pulsing problems. TT 2 should be much better with ffdshow, although any extra colour conversion or processing is bound to produce more rounding errors.


----------



## bblue

Quote:

...although any extra colour conversion or processing is bound to produce more rounding errors.
With the quality and capabilities of DVD as they are, I doubt the rounding errors are going to contribute that much.


--Bill


----------



## techman707

Quote:

_Originally posted by JeffY_
*There isn't a pedestal as such; all controls are purely digital.


There is no digital processing room on either side of black and white: if you raise brightness, white detail disappears; if you decrease brightness, black detail disappears; if you raise contrast, they both disappear. The overlay is so simple, there is nothing to calibrate. Video black is the lowest black possible and video white is the whitest white, and from an analogue point of view black is 0mV and white is 700mV (or damn near it). The default settings are in essence perfect and can only be made worse by changing them.


With TT the main problem is that the default contrast (105) is too high and everything above 97-98 IRE is clipped.*
I'd sure like to SEE what your signals look like!!! How do you know whether it is the TT defaults or your overlay settings that are wrong? Also, if you're losing black detail when the pedestal is set correctly, then it would appear that you might have the wrong gamma. Again, I would love to see what you're seeing.


----------



## ChrisWiggles

Bill, the resulting color banding can be quite visible indeed.


----------



## techman707

Quote:

_Originally posted by JeffY_
*There isn't a pedestal as such, all controls are purely digital.

*
What do you mean by "purely digital", since in the end, you're going to be looking at the "analog representation" of the adjustments anyway? What's the difference "what" the controls are operating on?


----------



## techman707

Quote:

_Originally posted by ChrisWiggles_
*Bill, the resulting color banding can be quite visible indeed.*
Where did that come from? Are you talking about using ffdshow?


----------



## ChrisWiggles

never mind. 


I got my threads confused. Perpetually speaking.


----------



## JeffY

Techman, the graphics card uses RGB: black is 0,0,0 and white is 255,255,255, and everything in between gives you your colour, greys, etc. Whatever analogue signals the card outputs at 0,0,0 and 255,255,255 cannot be altered in any way; hopefully they will be 0mV and 700mV, or at least close enough. The DVD is encoded in YCrCb digital component; if you use the hardware overlay, YCrCb 16-235 luma and 16-240 chroma is converted to RGB 0-255. All the overlay settings do is alter the way digital component video fits into RGB. The default overlay settings are those that do a straight conversion without adjustments. The reason TT has different defaults is that the author decided to use the overlay settings Cliff Watson came up with using his scope; Cliff now understands how the overlay settings work and he no longer uses the scoped settings. Now, if you slightly increased brightness and decreased contrast, there might actually be some benefit, since YCrCb 16-235 doesn't fit properly into RGB 0-255 because you get values that are out of range. This basically gives you some wiggle room at both ends.
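Those out-of-range values are easy to demonstrate numerically (a sketch assuming the commonly published BT.601 limited-range conversion coefficients, with no clamping so the overshoot is visible):

```python
def ycbcr_to_rgb(y, cb, cr):
    """BT.601 limited-range YCbCr -> RGB, left unclamped to expose overshoot."""
    r = 1.164 * (y - 16) + 1.596 * (cr - 128)
    g = 1.164 * (y - 16) - 0.392 * (cb - 128) - 0.813 * (cr - 128)
    b = 1.164 * (y - 16) + 2.017 * (cb - 128)
    return r, g, b

# Fully saturated video red (Y=81, Cb=90, Cr=240) is legal YCbCr,
# yet its green and blue components land just below 0 and must be clipped:
r, g, b = ycbcr_to_rgb(81, 90, 240)
# And a legal (if unnatural) code combination overshoots the top wildly:
r2, _, _ = ycbcr_to_rgb(235, 128, 240)  # r2 is well above 400
```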


----------



## mp20748

Quote:

_Originally posted by JeffY_
*

The reason TT has different defaults is that the author decided to use the overlay settings Cliff Watson came up with using his scope. Cliff now understands how the overlay settings work and he now longer uses the scoped settings. Now if you slightly increased brightness and decreased contrast then there might actually be some benefit since YCrCb 16-235 doesn't fit properly into RGB 0-255, because you get values that are out of range. This basically gives you some wiggle room at both ends.
*
This has been discovered some time ago, and that is why we have been saying that we do our own scoping.


Now, let's get back to your original claim, that the levels cannot be adjusted using a scope and that it has to be done using test patterns.


Now explain why a scope will not work.


----------



## JeffY

Are you saying that the point of clipping changes with each card?


----------



## techman707

Jeff,


I take issue with some of the things you say, however, it's too involved to go into here now.


However, the "conversion" you refer to takes place in EVERY projector that has component inputs or uses a transcoder. In fact, the standard for the encoding matrix has NEVER been finalized by the SMPTE; at this point in time the "current" matrix is only an SMPTE "Recommended Practice". The conversion from component to RGB results in a slightly inferior signal. However, in the case of the overlay, it's possible to "create", in effect, a "custom matrix".


While a particular driver version may not have been coded to take advantage of it, it is certainly possible to code a driver to control ALL the parameters necessary to adjust the "digital parameters" so as to (reasonably) output an accurate analog signal from the display card.


I don't know how Cliff Watson went about arriving at the TT "default" settings; what I do know is that a scope alone wouldn't be enough. A lab-grade vectorscope would be necessary as well. When the luminance signal is measured, if the chroma information isn't rolled off, the signal levels will be totally off the wall. For maximum peak white detail, the "average peak white level" shouldn't go over 90% of the maximum peak white. When it comes to film transfers, which are the most difficult material to reproduce because of their wide dynamic range, you can NEVER get the same contrast range with video (at this time) without "some" compression. This is where ACCURATE gamma settings become very important. Unfortunately, where DVD's of film transfers are concerned, there is NEVER going to be a ONE SETTING FITS ALL.


Bruce


----------



## JeffY

I'm not one for changing settings trying to get something that works for each movie; it would drive me nuts. For me, one setting has to fit all.


----------



## techman707

Quote:

_Originally posted by JeffY_
*I'm not one for changing settings trying to get something that works for each movie; it would drive me nuts. For me, one setting has to fit all.*
Then you're in trouble.


----------



## JeffY

Actually, if you're into this, TT is perfect because it remembers the settings per movie.


----------



## Chuchuf

Jeffy,

I'd like to hear more about your theories on scoping the card and what you think on this issue.

When scoping, of course I am using test patterns to determine what the output needs to be set to.

But that is just an "ideal" case and certainly doesn't apply to all DVD transfers. That is unless they are all mastered the same and I seriously doubt they are.

But then that is the point: it allows you to set up an absolute known that is correct on the HTPC. When it's then put into your video chain, you should calibrate to this known.

I would say look at Master and Commander as a prime example of a very dark transfer that needs the brightness kicked up a bit to look good. And yes, that is the beauty of TT: it memorizes the overlay settings for just that DVD, unless you want them set up as a default. Pretty neat feature. In fact, yesterday would be a prime example. When I was done scoping the overlay, I put this test machine in my system and ran test patterns to ensure that my PJ's contrast and brightness were set up correctly, then ran the usual 5th Element scenes to ensure that the scoped values were correct (they were). I then decided that I wanted to watch Torque (bad movie!!!) and the difference was dramatic. So much so that I rescoped the card. Not a very good transfer in that regard.

I don't know where Andrew originally came up with his default values for TT but I can tell you that those are not what I am working with now.


Terry


----------



## techman707

I'm amused by all this; we're arguing about "default" settings when, even if they were PERFECT, it would depend on what DVD you used to judge them. Some DVDs are perfect and look stunning (few and far between), while "most" have SOME deficiency.


The WORST color example I've seen is the DVD of It's A Mad, Mad, Mad, Mad World. This is a picture I've seen at LEAST 1000 times since 1963. On the "intermission" title at the intermission, and at the end of the picture when it says "the end", the background color is supposed to be SOLID YELLOW. On this DVD (the only version ever released on DVD) it's solid RED.


That's what you call REAL quality control! The transfer was done on the finest equipment available today, a Rank Cintel HD scanner, and was "color corrected" using the latest DaVinci color correction system. The company that did the transfer was Sunset Digital in LA. The bottom line, as usual, is that you can have the finest equipment in the world and it STILL boils down to some idiot who doesn't care what he does. Even the "old" analog disc has the correct color.


Jeff,


Good luck with your "one setting fits all" setup.


Bruce


----------



## deronmoped

Bruce


I could not agree more. The DVDs I have seen are all over the place in the more obvious things, like contrast, sharpness and EE. I have no way of judging some of the more subtle things, but who knows how much those vary.


Let's hope with a new format a lot of those problems are solved.


Deron.


----------



## JeffY

Quote:

_Originally posted by Chuchuf_
*Jeffy,

I'd like to hear more about your theories on scoping the card and what you think on this issue.
*
It's not just me; here's what Bjoern Roy said on the subject.

Quote:

Using a scope is not only not necessary to set the output of a card, it's actually conceptually inappropriate.
So how do you guys watch movies? Watch a bit, get the scope out, do a quick recalibration, note down the values (TT does this bit for you)? Do you then go back to the beginning or just continue from where you left off? What happens when things change mid-movie, as they often do? Do you get the scope out again or leave it?


----------



## RoBro

Would it be allowed to "remaster" a DVD with correct values (and maybe also correct progressive flags) and sell it, if you sold it ONLY to people who already own the original DVD?

Roland


----------



## bblue

Quote:

So how do you guys watch movies? Watch a bit, get the scope out, do a quick recalibration, note down the values (TT does this bit for you)? Do you then go back to the beginning or just continue from where you left off? What happens when things change mid-movie, as they often do? Do you get the scope out again or leave it?
If your monitor/projector is correctly calibrated and you *know* what to expect from it as a result, and your playback equipment is also calibrated and you *know* what to expect from it (and can tell the difference between the two), it is relatively easy to make a quick and safe adjustment, with little or no side effects, during playback on the fly. It's when the relationship between these areas is not known, when one or the other or both are not calibrated and you have no visual standard, that it can become a nightmare.


When a calibration standard is utilized you can pop in a DVD or any source and immediately see exactly where it fits in the quality and balance ranges, and adjust accordingly. Not surprisingly, with everything calibrated correctly, you don't *need* to adjust every DVD and you find that many that you thought were off, really aren't. They just have their own look that most of the time you can't adequately correct for anyway.


You don't rely on the scope to tell you what it looks like or what to adjust, you rely on it (and a properly calibrated projector) to help set your standards so that your video chain can reproduce whatever is fed to it in a neutral way. Then based on that, you can fine tune a particular source accordingly. In that endeavor the scope can also be useful.


--Bill


----------



## Chuchuf

Like Bill said. It's a reference that you do once.

But this thread and the other one are interesting in that they have provoked me to think about this a bit. From what I am hearing, what has been suggested is that 0 IRE should be at some small voltage above 0 V and 100 IRE should be at some voltage below .7 V. I can tell you that if you set up this way you will end up with a picture that has grey blacks and less than the desired contrast on the whites. It was then suggested that to accommodate these higher and lower levels we could set our contrast higher and brightness lower. Do they mean on the PJ or display device??


Terry


----------



## bblue

They were originally saying to map YUV colorspace 16-235 directly to RGB colorspace in the overlay, and with 16 min / 235 max digital driving the overlay on the card, adjust the PJ/monitor to suit the modified levels coming from the HTPC. I still don't understand why this should be necessary, since you can accomplish correct output levels with no crushing, and even accommodate whiter-than-white if you choose to -- with the right tools.
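For anyone following along, this is the level expansion being debated: a direct 1:1 mapping leaves 16 and 235 untouched, while expansion remaps them to 0 and 255. A minimal sketch of the expansion (the function name is illustrative):

```python
def expand_studio_to_full(y):
    """Map a studio-range code (16 = black, 235 = white) to full-range
    RGB (0-255). Anything below 16 or above 235 gets clamped, which is
    the 'crushing' of blacker-than-black and whiter-than-white."""
    out = round((y - 16) * 255 / 219)
    return max(0, min(255, out))

print(expand_studio_to_full(16), expand_studio_to_full(235))  # 0 255
```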


--Bill


----------



## Brian Morris

I have been reading this thread for a while now and decided to buy a 5700 ultra last week. I bought the evga 5700ultra to replace my ATI R9700pro.

After cleaning all the ati drivers and programs out and installing the Nvidia stuff...

The first thing that I did notice is that the desktop is much cleaner. After (still) fooling around with PowerStrip to get my settings back, I finally broke out my new copy of Fifth Element Superbit. WOW, this card is nice... Great colors and pretty sharp...

Oh ya, and the best thing about this card is... using DScaler, ZoomPlayer, FusionHDTV and just about anything I threw at it... IT HAS NEVER LOCKED UP, frozen or given me the reboot. With my R9700pro I would have at least a reboot once every 3-4 hours of viewing. The 5700U is going on about 4-5 days...

I know that this has all been said before, but I thought I would throw in my 2c's.


Brian


----------



## jcmccorm

Here's my 2 cents, as I swapped out my 9500 Pro for the 5700 Ultra a couple of nights ago. The desktop is sharp and clean. This thing is driving 30' of 1694A cable and doesn't have a problem with it. I'm impressed. I'll probably add Mike's MP-1.3 when it's available, but so far this card looks really good.


Cary


----------



## deronmoped

What did your 2 cents set you back?


Deron.


----------



## Lifter

I just did the same based on this thread. Retired the 9500 Pro and got a BFG 5700U. The desktop is drastically better. TT seems a lot worse, unfortunately. Very disappointing. But newer decoder filters seem to look better, so I guess I'll just wait for TT 2.0.


----------



## maneuen

Sean, are you running ffdshow with TheaterTek?


I recently changed from a 9500 to the same BFG 5700 Ultra. Like others, I found that the desktop was sharper... but I also felt that in TT there wasn't much of a change.


That opinion changed once I turned off ffdshow. Strictly video card vs. card, the 5700 was much cleaner, and at least in my case the colors were more natural and at the same time more defined... a definite step up in my case.


I haven't turned ffdshow back on yet, but I'm sure the two cards take different settings... so if you are holding over your ffdshow settings from your ATI card, that may explain why TT doesn't look as good... and don't forget to run Avia again... as the new card probably wiped out your "default" settings in TT (at least it did in mine).


Once I reset the proper levels in TT....the 5700U with no ffdshow looked much better than the 9500 w/ ffdshow.


-Mike


----------



## Brian Morris

I was wondering....

My friend has a scope, and I wanted to set the levels on my 5700U and HTPC. Is there somewhere, or someone, that has the info on how to do this?

I know how to work the scope; I'm just wondering what levels they're supposed to be at... .7 V? Non-clipping?


Thanks

Brian


----------



## Chuchuf

Just got done watching TT2 with VMR9 scoped settings and it looks fantastic. VERY clean. We did both a 0 IRE = 0 mV, 100 IRE = 700 mV setup and a 7.5 IRE NTSC setup. I am certain that the 0 setup looked much better. So I am of the opinion that this is the way it should be set up. Others will disagree, but that has been covered in another thread.

I also like the interaction of the brightness controls better on the VMR than the overlay.

More testing to be done.

If you are thinking about an NVidia card, take a look at the PNY's. I really like their heat sinks a lot better than some of the others.


Terry


----------



## bblue

I've been recommending specific colorspace settings in ffdshow which look quite good on the scope: an Input setting of Raw, YUV; Levels as the first filter with Input White set at 235 and Output Black at 16; and the Output set at YV12. That is still correct for ffdshow versions up to and including 20040629.


The odd Levels filter settings didn't really make sense as documented, but that's the way they produced the right ranges when viewed on a scope. Apparently this has been a long-standing bug in ffdshow, which has finally been corrected starting in the July 2004 releases. In those releases, both settings changes occur on the Input side of the Levels filter, where they should: Input Black should be set to 16 and Output White set to 235. If you're watching on a scope, you can fine-tune these to exactly the right place with a test DVD, along with minor Output filter changes if warranted.
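In case the filter's behavior is unclear, a levels filter of this kind is just a linear remap between an input range and an output range. A generic sketch (parameter names are illustrative, not ffdshow's exact field names):

```python
def levels(x, in_black=16, in_white=235, out_black=0, out_white=255):
    """Linearly remap [in_black, in_white] to [out_black, out_white],
    clamping input that falls outside the range."""
    t = (x - in_black) / (in_white - in_black)
    t = max(0.0, min(1.0, t))  # out-of-range input is clipped
    return round(out_black + t * (out_white - out_black))

print(levels(16), levels(235))  # 0 255
```

With out_black=16 and out_white=235 the same function becomes a clamped passthrough, which is one way to read the settings described above.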


There are a number of speed-up and performance improvements throughout these newer versions, so be sure to read the README files. Both the lanczos and bicubic resize algorithms have been modified to work more efficiently in certain ranges. These mods make quite a difference!


--Bill


----------



## techman707

Quote:

_Originally posted by Chuchuf_
*Just got done watching TT2 with VMR9 scoped settings and it looks fantastic. VERY clean. We did both a 0 IRE = 0 mV, 100 IRE = 700 mV setup and a 7.5 IRE NTSC setup. I am certain that the 0 setup looked much better. So I am of the opinion that this is the way it should be set up. Others will disagree, but that has been covered in another thread.

I also like the interaction of the brightness controls better on the VMR than the overlay.

More testing to be done.

If you are thinking about an NVidia card, take a look at the PNY's. I really like their heat sinks a lot better than some of the others.


Terry*
Terry,


It really doesn't matter what basic setup (pedestal) you use; however, if the 0 IRE setup looked better, you're probably using a pattern that is 0 IRE. It's more important to use the correct setup for the material you're feeding the PJ.


You often see people asking whether their high "brightness setting" is OK. The answer is: if the projector was set up for 0 IRE, then when they run broadcast material that is, at this point, mostly 7.5 IRE, they're going to have to raise their brightness control. You just have to be sure there is enough headroom to get the proper brightness.
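To put numbers on the 0 vs 7.5 IRE difference: on the NTSC scale (140 IRE from sync tip to peak white spanning 1 V), 7.5 IRE of setup puts black roughly 54 mV above blanking. A quick sketch (the function name is mine):

```python
MV_PER_IRE = 1000.0 / 140.0  # NTSC: 140 IRE (sync tip to peak white) spans 1 V

def black_level_mv(setup_ire=7.5):
    """Black level in mV above blanking for a given pedestal. This gap
    is why a projector calibrated for one setup needs its brightness
    control shifted when fed material mastered with the other."""
    return setup_ire * MV_PER_IRE

print(round(black_level_mv(7.5), 1))  # 53.6
print(black_level_mv(0.0))            # 0.0
```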


Bruce


----------



## jcmccorm

Say Bruce, isn't that the opposite? If you're used to getting 0 IRE for black and then your source changes to 7.5 IRE, shouldn't you lower the brightness? Just being picky... 


Cary


----------



## Chuchuf

Bruce,

That was why we did two setups. So that I had the choice and could easily change based on the material.

It's interesting that the DVD manufacturers have known about this and adjusted for it in DVD players, the setup generally giving you a lighter and a darker control in most modern players.

So the theory is that this may also apply to TT2.0 and Andrew _may_ have two settings (0 and 7.5) that will allow you to choose.


Bill,

Wouldn't mind experimenting with this, but this early beta version of TT2 we are working with doesn't seem to have (and I might be absolutely wrong here) the post-processing enabled.


Terry


----------



## techman707

Quote:

_Originally posted by Chuchuf_
*Bruce,

That was why we did two setups. So that I had the choice and could easily change based on the material.

It's interesting that the DVD manufacturers have known about this and adjusted for it in DVD players, the setup generally giving you a lighter and a darker control in most modern players.

So the theory is that this may also apply to TT2.0 and Andrew _may_ have two settings (0 and 7.5) that will allow you to choose.


Terry*
Terry,


You would think that the people making masters for DVDs would SETTLE on ONE setup, whether 0 or 7.5 IRE, so the average user wouldn't have to adjust their TV every time they put in a different disk. 


They know how to put FLAGS in the signal to prevent digital copying; they should put a flag in for switching setup levels. 


Bruce


----------



## Mark_A_W

Well I already deal with PAL and NTSC, different reg settings, different res's, different pj setups.


This 7.5 IRE black is for NTSC DVDs only, right? Everything else is 0 IRE?


----------



## Chuchuf

Mark,

I believe you are correct. 7.5ire is an NTSC thing.


Bruce,

As usual I was thinking the same thing. Flags on the DVD to set up the brightness and contrast. Great minds following the same path....lol


Terry


----------



## techman707

Quote:

_Originally posted by Mark_A_W_
*Well I already deal with PAL and NTSC, different reg settings, different res's, different pj setups.


This 7.5 IRE black is for NTSC DVDs only, right? Everything else is 0 IRE?*
Well..... 7.5 IRE is SUPPOSED to be for NTSC, but the DVDs seem to have... whatever the hell they feel like using. That's the problem.


----------



## Chuchuf

That's the way I see it, Bruce. You never know with a DVD.

Now here is where it gets even fuzzier. What about WMV HD content?? Is it 0? Is it 7.5??

I was told last week that there is a set of standards MS wrote that the studios should follow when they convert to WMV HD. But from what I understand they don't have to; these are only suggestions.

But my question is, what reference is WMV HD being encoded to in the specs MS has written?? Is it NTSC 7.5?? Or does it follow the ATSC HD spec of 0? Anyone??


Terry


----------



## techman707

It SHOULD be 0 IRE, but since we're dealing with some of the very same people that can't get the DVD's right...or at least uniform, I wouldn't assume ANYTHING.


----------



## Rickd

I have an mp-1 on a radeon 9700 pro. Is there documentation on putting it on to a 5700ultra or 5950?


Also, I have a Barco 808 projector, and port 3 has far better electronics than port 5. Trouble is, port 3 is a 9-pin D9 connection. I currently run a 5-way BNC from the MP-1 output to a 5-way BNC on the Barco 808 that I have attached to the port 3 input board... this is in parallel with the D9, which is connected out to the chassis connector, i.e. the 5-way input is soldered directly to the RGB input board to give me 5-way. I do however get some minor ghosting noticeable up close, which leads me to believe I am not optimized for this input, or there is a slight impedance mismatch, given this board is designed for an HD9, which I think is 50 ohm (correct me if I am wrong).


I was wondering what the effect would be if I soldered a quality HD15 onto the back of the BNCs on the MP-1 and then connected my quality Silver Serpent HD15-to-HD9 cable from the MP-1 to the projector's HD9 input. I have a feeling this will match the impedance better at the projector end. I would then eliminate the modified 5-way BNC connection to this board.


What are your thoughts on utilising the better input card on the projector while trying to achieve the best impedance match in this situation?


Thanks Rick


----------



## mp20748

Rick,

the MP-1 should retro fit on either the 5700 or 5950. The heatsink used on the 'make' will determine the ease of doing it. At present I have no documentation or photos on doing this. I will have photos sometime soon, however someone else will be taking over attaching the mods to the cards in the very near future.


If you want to go HD15 (VGA connection on back of card), it would be best to attach the signal from the output of the MP-1 to the HD15 (disabled) connector on the rear of the 9700.


The DB9 and HD15 connectors seem to not have impedance measurements -- at least I've never known of any.


If you can attach a DB9 or HD15 connector onto a 5-BNC cable, that cable is too small for high-performance video. If so, expect ghosting when connected to HD15 and DB9. If you're using high-grade RG59/RG6, then the impedance of the HD15 and DB9 should be less of a concern.


----------



## VideoGrabber

Just an FYI, as per Extron tech notes, the impedance of an HD15 VGA connector is ~100 ohms. Their spectral/transient analysis showed minimal impact from the impedance mismatch to 75 ohm transmission lines, due to the short lengths of the connector involved.
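The effect of that mismatch can be estimated with the standard transmission-line reflection formula. The numbers below assume the ~100 ohm connector figure in a 75 ohm line; as Extron notes, the connector's short electrical length makes the real-world impact smaller still:

```python
import math

def reflection_coefficient(z_load, z0=75.0):
    """Voltage reflection coefficient at an impedance discontinuity."""
    return (z_load - z0) / (z_load + z0)

gamma = reflection_coefficient(100.0)
print(round(gamma, 3))                    # 0.143 -> ~14% of the voltage reflected
print(round(-20 * math.log10(gamma), 1))  # 16.9  -> return loss in dB
```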


- Tim


----------



## Rickd

Thanks Mike for the response


So just to clarify, what you are saying is... "pass the output of the MP-1 through to the HD15 on the card, and use my existing run of Silver Serpent cable with HD15 from the back of the card to the HD9 on the projector."


Yes?


To do that, do I disconnect the cable's output to the BNCs and jumper it to the HD15, no?


Or get another MP-1.3, add it to the 5950 and pass it through the card's HD15 connector rather than the BNC output, yes?


Thanks Rick


----------



## mp20748

Quote:

_Originally posted by Rickd_
*Thanks Mike for the response


So just to clarify, what you are saying is... "pass the output of the MP-1 through to the HD15 on the card, and use my existing run of Silver Serpent cable with HD15 from the back of the card to the HD9 on the projector."


Yes?


To do that, do I disconnect the cable's output to the BNCs and jumper it to the HD15, no?


Or get another MP-1.3, add it to the 5950 and pass it through the card's HD15 connector rather than the BNC output, yes?


Thanks Rick*
Yes, you can disconnect the cables from the BNC and connect them to the HD15 on the card. You'll only have to connect the RGB, the sync is already in place on the HD15.


The 5950 is a power card. But I'm not sure why you'd want another MP-1 if you already have one -- why?


For the 5950/MP-1 I would suggest high grade RG59 or RG6 cables.


----------



## Rickd

I thought you were coming out with a better mp1.3 for this card with low noise floor etc.....would you just recommend moving my old mp1 to the new 5950.


Cheers Rick


----------



## Rickd

I thought you were coming out with a better mp1.3 for this card with low noise floor etc.....would you just recommend moving my old mp1 to the new 5950.


Cheers Rick


----------



## mp20748

Quote:

_Originally posted by Rickd_
*I thought you were coming out with a better mp1.3 for this card with low noise floor etc.....would you just recommend moving my old mp1 to the new 5950.


Cheers Rick*
The best mod for the 5950 is the newer MP-1v3. It is a much higher bandwidth version of the MP-1 than what you have. The MP-1.3 is the stand-alone version that was designed for the Holo3 card only.


Your present MP-1 will work perfectly on the 5950, but the newer MP-1v3 takes the 5950 to another level of performance.


----------



## brodgers

When will the MP-1v3 be available and will it work on a 5700?


Thx.


----------



## Chuchuf

Don't know when MP will have it ready for production, but yes, it should work with the 5700. If you get one, get the PNY version, as they have much better heat-sinking than the others.


Terry


----------



## Rittberg

By whom, or how (DIY?), is the MP-1v3 going to be attached to the 5700/5950?

Is there any advantage to purchasing the 6800 over the 5950 for HT?


Thanks,


Rittberg


----------



## stylinlp

Just FYI, you can get a Radeon 9600 with no fan for $90 these days. Having a 10-bit DAC is nice over the 8-bit 9200s and below, besides the fact of full DirectX 9 compatibility.


I ordered an ASUS 9600XT today from Newegg. Only spent $118. Can't wait for that puppy to come in. 

Maybe I should MP-1 that.


----------



## mp20748

Quote:

_Originally posted by brodgers_
*When will the MP-1v3 be available and will it work on a 5700?


Thx.*
I'll have the first batch finished this week. And yes, the 5700 is perfect.


At my next HT gathering, I'll have my HTPC, which will have a 5950/MP-1v3. The 5950 is an overclocked version of the 5700, with much beefier on-board power supplies, and much more ram (not needed for video/DVD). The difference between the two (5700/5950) is subtle for DVD and video applications.


My HTPC is loaded with various high performance test patterns running from desktop, and it'll also have some HD movies. I'll also have AVIA PRO and other test pattern software.


This gathering is scheduled for August 15th, I'll leave all comments on this up to the attendees.


----------



## RoBro

Anyone know if the scaling of the Nvidia FX series is as good as that of the 9000 series Radeon cards?

Radeons have always used bicubic scaling, while earlier GeForces did not. Do the FX series have that now? And if so, do the lower-end FX cards (5200) have it too?

Roland


----------



## mikecazzx

Quote:

_Originally posted by mp20748_
*No, wait on the NVIDIA. It's physically very different from the ASUS 5700. The visual difference is huge. The NVIDIA has many more (and larger) caps than the ASUS, plus it shows far more components than the ASUS.


So it appears that they both may be using the same processor, but we're not sure that the ASUS has the same "Ultra" feature, or that it has the same engine that the 5700 is using, because the NVIDIA even has a much larger heatsink.


There's no way of looking at the two cards and saying that they are the same. So what would be the difference?


What we are testing with the 5700 Ultra could be a feature that is not in the ASUS, and may not benefit from the DVD software that is being beta tested on the HTPC forum, and offered by Nvidia only.*
Which card or cards are you referencing here? Which exact make or model is the recommended card when switching over from a Radeon?


----------



## brodgers

If the first batch of MP-1v3's is ready this week, when will they be available for the rest of us???


----------



## techman707

Quote:

_Originally posted by mikecazzx_
*Which card or cards are you referencing here? Which exact make or model is the recommended card when switching over from a Radeon?*
I believe Terry was referring to the 5950 Ultra PNY cards with the "better" cooling fan.


----------



## Chuchuf

That's right, Bruce. I like the cooling heat sinks they put on the PNY-manufactured 5700's and 5950's. Much larger on the DDR chips and an overall better heat sink design.


Terry


----------



## mbrandt

Terry, Mike or Vern -


What software are you using with the PNY 5700U/5950U? I remember there being some issues with DVD playback using the stock drivers. Has that been resolved yet?


By the way, you guys are like mad scientists. Thanks for all the work you've done.


- Mark


----------



## Vern Dias

My 5950's are all BFG's, with the reference design heatsink.


I'm running the release 6171 drivers. I have never seen any DVD playback issues with any of the various driver releases.


Since I am now primarily using DVI to a Sharp 12000, the MP mod is no longer a real must have for me.


Vern


----------



## ceru

Out of curiosity, do these cards use 4-tap scaling similar to the Radeons, or the superior 10-12 tap scaling like the H3D II? If not, is the ffdshow resize filter capable of upscaling without exhibiting scaling artifacts? For example, 1440x960p resolves the 6.75 MHz pattern in Avia perfectly, whereas 1280x1024 exhibits some scaling error when displaying the same pattern. Thanks in advance to anyone who can pop in and answer this.
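One reason 1440x960 behaves so well with 720x480 DVD content is that it is an exact 2x scale in both directions, while 1280-wide output forces a fractional ratio. A small sketch of the arithmetic (the function name is illustrative):

```python
from fractions import Fraction

def scale_ratio(src, dst):
    """Scaling factor as an exact fraction. Integer ratios resample
    cleanly; fractional ones make the scaler interpolate every output
    sample, which is where multi-burst patterns start to show errors."""
    return Fraction(dst, src)

print(scale_ratio(720, 1440))  # 2 -> clean 2x upscale
print(scale_ratio(720, 1280))  # 16/9 -> every pixel interpolated
```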


----------



## techman707

Quote:

_Originally posted by Vern Dias_
*My 5950's are all BFG's, with the reference design heatsink.


I'm running the release 6171 drivers. I have never seen any DVD playback issues with any of the various driver releases.


Since I am now primarily using DVI to a Sharp 12000, the MP mod is no longer a real must have for me.


Vern*
What were you using before the Sharp 12K and how do you feel the 12K compares to it? How many hours have you put on the 12K?


----------



## Chuchuf

Mark,

I'm running (testing) the TT2 NVidia filters and the latest drivers from NVidia. Testing with VMR9 & 7 and overlay.


Terry


----------



## mbrandt

Thanks Terry & Vern -


Terry - I take it that you're using the TT2 filters within Zoomplayer then. Correct?


I remember there being some talk a while back about tearing or pixelation when using Theatertek with the (then current) NVidia drivers. Maybe I was just smoking something...


- Mark


----------



## Chuchuf

Mark,

No, I am running TT2 only. Andrew lives close by and comes over to give me updates for my HTPC, which lets him look at the results on my G90. In fact, he is supposed to be over this week with more functionality and newer NVidia stuff. Can't wait.

I'm anxiously waiting for MP to get the "new" MP mod for this 5700 card so that we can test the results on it. Andrew has one of the original mods on his 5950 and is quite pleased with it.


Terry


----------



## Vern Dias

Quote:

What were you using before the Sharp 12K and how do you feel the 12K compares to it? How many hours have you put on the 12K?
I still have my Runco DTV-991 in place and use it as a backup projector, or when I don't want to put hours on the Sharp.


Overall, I have to give the nod to the Sharp for delivering an overall more pleasing image: brighter, much more even illumination, and sharper edge to edge. Black levels are not quite as good on the Sharp when looking at a fade to black, but the lack of absolute black has never been noticeable on any program material (including very dark scenes).


I have about 350 hours on the Sharp.

Quote:

I remember there being some talk a while back about tearing or pixelation when using Theatertek with the (then current) NVidia drivers. Maybe I was just smoking something...
I'd say you were definitely smoking something, then.  The current version of TT can't be used with the NVidia FWMM.


Vern


----------



## malefactor

MP et all:


Have you found any difference between the ultra/nonultra version of the 5700 for home theater purposes?


"LE" version 5700: $110

"Ultra" version 5700: $200

5950: $500


It seems like price/perf would lie in the PNY branded LE (assuming it's equivalent to the Ultra for us)?


----------



## lawdawg

Quote:

_Originally posted by Vern Dias_
*I'd say you were definitely smoking something, then.  The current version of TT can't be used with the NVidia drivers.


Vern*
Which NVidia drivers are you guys talking about? I just installed an eVGA 5700 Ultra and was able to use TT with DXVA turned off, using the shipping drivers. Does anyone know what the latest and most stable drivers for these cards are, for HTPC use?


Can't wait to use VMR9 in TT 2.0!


----------



## Vern Dias

Sorry, my bad. I was reading drivers and thinking FWMM filters.


Anyhow, I have never seen any issues with various versions of the DRIVERS with either ZP or TT.


Vern


----------



## lawdawg

Thanks for the clarification Vern. I appreciate the info.


----------



## Briands

Quote:

_Originally posted by malefactor_
*MP et all:


Have you found any difference between the ultra/nonultra version of the 5700 for home theater purposes?


"LE" version 5700: $110

"Ultra" version 5700: $200

5950: $500


It seems like price/perf would lie in the PNY branded LE (assuming it's equivalent to the Ultra for us)?*
I'm curious about this too. I need to buy a card to get another computer up and running, so I figured I'd move the 8500 from the HTPC and replace it with an Nvidia. Then I started looking at prices... OUCH!!! Some of these cost more than my first car... If I don't do any gaming, what card is recommended?


----------



## stylinlp

You can get a Radeon 9600 for $85 with no fan. That's a 10-bit DAC video card, and those are known to have clean video quality. The Radeon 9200 and under are only 8-bit. Same with nVidia cards until the 5900's.


----------



## Nima

I am also curious.


----------



## jimwhite

Quote:

_Originally posted by stylinlp_
*You can get a Radeon 9600 for $85 with no fan. That's a 10-bit DAC video card, and those are known to have clean video quality. The Radeon 9200 and under are only 8-bit. Same with nVidia cards until the 5900's.*
Wrong.... the GeForce4 MX cards were the first with 10-bit DACs.....


----------



## stylinlp

No way! Take that back!!!!!!


Oh, really? I don't know, but it's just what I've read on here a few times in the past. Radeon is still better. 


Radeons rule and nVidia drools!


Has MP scoped a radeon 9600 and up yet?


----------



## jimwhite

Quote:

_Originally posted by stylinlp_
*No way!*


... Way!.....


----------



## BenY

That's right...

Radeon is better.........

Especially with the latest drivers and the last update to the DVD player.

It doesn't get any sharper than that, and motion is good too.

When I first got it I was very disappointed, but slowly the drivers improved and now it's just great.


Yuval.


----------



## jimwhite

Quote:

_Originally posted by BenY_
*That's right...

Radeon is better.........

Especially with the latest drivers and the last update to the DVD player.

It doesn't get any sharper than that, and motion is good too.

When I first got it I was very disappointed, but slowly the drivers improved and now it's just great.


Yuval.*


that puts the comments into context.....


----------



## mikecazzx

Quote:

_Originally posted by Nich_
*Bill, here's the link to the ASUS page.

http://uk.asus.com/products/vga/v9570td/overview.htm 


Nicholas*
Does this provide the same performance as the 5950 that Mike Parker has declared as the best?


----------



## mcpherv

mikecazzx,

I talked to Mike a few days ago, and he stated that he has tested the ultra versions of the cards. The one you have linked in your post is a normal version of the 5700 and, if you compare it to an ultra model, has far fewer caps in the top right-hand corner, probably suggesting a weaker power supply and lower overall performance.

Vic


----------



## mikecazzx

Quote:

_Originally posted by mcpherv_
*mikecazzx,

I talked to Mike a few days ago, and he stated that he has tested the ultra versions of the cards. That one you have the link in your post for is a normal version of the 5700, and, if you compare it to an ultra model, has far fewer caps in the top right hand corner, probably suggesting a weaker power supply, and lower overall performance.

Vic*
Ok I think I finally have read all of this post.


Looks like a 5700 Ultra will replace my Radeon soon - if it's affordable.


----------



## Joe Przybylski

when does the 5700 ultra come out? I can't find it anywhere...


----------



## Joe Przybylski

wait.. is this the one everyone is referring to? But there is also a 5700 ultra PCX...

http://www.newegg.com/app/ViewProduc...170-052&depa=0


----------



## hdtv_lover

Now that the 6800 is in stores, albeit for $500!!!!, has anyone tried it? With its own built-in video processor it sounds great!


----------



## Briands

It appears that at least some FX5700 Ultras have hardware conflict issues with MyHD cards. I just got a Chaintech AA5700U and I cannot make it work with my MyHD card.


Anyone else here having problems?


----------



## Vern Dias

I would suspect it's the other way round, ie, MYHD conflicting with the 5700.


That said, make sure that the MyHD card is not in a slot sharing the same interrupt level with the AGP slot. Check your mobo manual for shared interrupts.


Vern


----------



## Briands

Quote:

_Originally posted by Vern Dias_
*I would suspect it's the other way round, i.e., the MyHD conflicting with the 5700.


That said, make sure that the MyHD card is not in a slot sharing the same interrupt level with the AGP slot. Check your mobo manual for shared interrupts.


Vern*
I guess it is a matter of perspective... since the MyHD card was there first and it worked fine with the Radeon 8500... But you are technically correct.


I have tried all of the slots with no luck. Guess it's lucky that my primary HD is my Motorola cable box. Maybe I'll just retire the MyHD, or move it to one of my other machines... then I can time shift across the network with a software player.


----------



## malefactor

Don't get your hopes up wrt software players. Several of us have had little to no luck trying to come up with something for this specific application in the hipix thread.


(It's more essential with the hipix because the thing is a space heater.)


Short answer: few to no good solutions. If someone really understood the filters and ZoomPlayer, it could probably be set up. But either no one does, or no one bothers to help out the masses.


----------



## Briands

Bummer.


----------



## stylinlp

For those of you not following the TheaterTek 2 threads and the threads in the HTPC forum, here is an update on using an nVidia card.


I just learned from Andrew over at TheaterTek that the new TT2 will also use special features only available on the new nVidia 6600-and-up cards when they come out next month. When the first 6600 comes out it will only be available as a PCI-Express card; a month later it will be available as an AGP card.


The newest nVidia cards will have even better video quality than the current nVidia 5700s and 5900s, and TT2 will use those benefits.

This is all good if you want to stick to a hardware solution rather than a software solution (FFDshow). To use FFDshow you must have a P4 2.8 GHz or an AMD64 CPU to get the full benefit of software rendering. They are saying that if you want to stick with your current CPU/motherboard and wish to go the hardware route, then the newest nVidia 6600 cards and up are the way to go.


----------



## deronmoped

I'm going to wait another year and see what video cards are out then. Things are changing so fast that if you don't force yourself to wait, you end up with a card that hasn't had a chance to have all its issues worked out.


Deron.


----------



## maneuen

But then there will be TheaterTek 2.5, nVidia video filters 5.0, etc.... and of course you must then have the latest whiz-bang, flux-capacitor-driven video card to take advantage of all the new features.


----------



## stoub

I tried to set up a 5700 Ultra (Leadtek) in my HTPC, and I'm not pleased at all :-(


I just couldn't make it work correctly at 1280x720 72 Hz with PowerStrip; sometimes it worked, but the resolution would go away at the next reboot... Or I'd set it up in VGA mode and the display would shift right on the projector, or the drivers would try to drive the projector at 200 Hz...


With my Radeon 9600SE, though, it works perfectly.


Did anybody have that kind of problem? (My proj is a Barco Graphics 801.)


----------



## mikecazzx

Quote:

_Originally posted by stoub_
*I tried to set up a 5700 Ultra (Leadtek) in my HTPC, and I'm not pleased at all :-(


I just couldn't make it work correctly at 1280x720 72 Hz with PowerStrip; sometimes it worked, but the resolution would go away at the next reboot... Or I'd set it up in VGA mode and the display would shift right on the projector, or the drivers would try to drive the projector at 200 Hz...


With my Radeon 9600SE, though, it works perfectly.


Did anybody have that kind of problem? (My proj is a Barco Graphics 801.)*
1280 x 720 @ 72 is one of the supported timings. I am NOT using PowerStrip on mine so far; the driver handles this timing.
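For what it's worth, the arithmetic behind a custom timing like 1280x720 @ 72 Hz is easy to sketch. The blanking fractions below are illustrative assumptions for the sketch, not the exact GTF/CVT figures the driver or PowerStrip would compute:

```python
# Rough sketch of custom-timing arithmetic for e.g. 1280x720 @ 72 Hz.
# The blanking fractions are assumptions; real GTF/CVT formulas derive
# them from the refresh rate.

def estimate_timing(h_active, v_active, refresh_hz,
                    h_blank_frac=0.20, v_blank_frac=0.05):
    """Return (h_total, v_total, h_scan_khz, pixel_clock_mhz)."""
    h_total = round(h_active * (1 + h_blank_frac))  # pixels/line incl. blanking
    v_total = round(v_active * (1 + v_blank_frac))  # lines/frame incl. blanking
    h_scan_khz = v_total * refresh_hz / 1e3         # horizontal scan rate
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    return h_total, v_total, h_scan_khz, pixel_clock_mhz

print(estimate_timing(1280, 720, 72))
```

The projector has to sync to the resulting horizontal rate, and the card's DAC has to be comfortably faster than the resulting pixel clock - which is where the "blanks and doesn't recognize the signal" failures tend to come from.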


----------



## stoub

Well, without PowerStrip, when I set 1280x720 the projector just blanks and doesn't recognize the signal anymore... Even at 640x480 it sometimes works, but sometimes it just slides to the side, or doesn't work at all...


----------



## usabrian

Ok, this is my mini-review after only a night with the PNY Geforce 6800 GT.


My previous experience was with an ATI Radeon 9000 in my last HTPC. Then I purchased a 5700 Ultra for use with my new P4 3.2 GHz HTPC and got the picture through overlay just gorgeous. My reference DVD is Moulin Rouge. Incredible reds on that disc, and the 5700 was stunning. But it was unable to run VMR9 at all, and VMR7 had significant tearing. So I had planned to upgrade to the 5950 when sticker shock kicked in (I could not believe it was $500). That card is the one highly rated by Mike Parker. In the end I picked up the 6800 at Fry's yesterday for $399.


Right away I noticed that the desktop was not as clear as with the 5700 Ultra. While colors were beautiful, on par with the 5700 and better than my Radeon, movies were noticeably softer than on the 5700. Buyer's remorse was really kicking in here, as I was pumping resize up to 1920x1920 and 1920x1440 as well as denoise3d afterwards, and the picture was still soft! By the way, I was able to play VMR9 to my heart's content without any hassle and without any tearing, but I find I prefer the overlay whenever I compare. More testing is in order though.


I played around with the various settings and updated to the latest nVidia-approved drivers. It was not until I found the sharpness control under "desktop color panel" that I noticed something. If I slide the sharpness control just a smidge off the default, the picture "snaps" and really comes to life, with none of the negative effects one would normally anticipate from sharpening. I assume for now that the default sharpness setting actually blurs the image, so I recommend that people move it just slightly off the lowest setting like I did. I don't remember whether I had done this previously with the 5700, so this may have been the case with that card as well, which makes comparisons difficult now. Either way, both cards blow away my ATI 9000 in terms of sharpness, clarity, noise, and especially color.
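As an aside, a driver sharpness slider of this sort is usually just a small convolution whose negative side-taps grow with the setting. The sketch below is my own illustration of the idea, not NVIDIA's actual filter:

```python
def sharpen_kernel_1d(amount):
    """3-tap kernel: amount < 0 blurs, 0 is pass-through, > 0 sharpens.
    The taps always sum to 1, so flat areas keep their level."""
    return [-amount, 1 + 2 * amount, -amount]

def convolve3(samples, kernel):
    """Apply the 3-tap kernel along one line, clamping at the edges."""
    out = []
    for i, s in enumerate(samples):
        left = samples[max(i - 1, 0)]
        right = samples[min(i + 1, len(samples) - 1)]
        out.append(kernel[0] * left + kernel[1] * s + kernel[2] * right)
    return out

edge = [0, 0, 1, 1, 1]  # a one-pixel luminance step
print(convolve3(edge, sharpen_kernel_1d(0.2)))  # over/undershoot at the step
```

A "default" that sits slightly on the negative side of zero would blur exactly as described above, and nudging the slider up would first restore pass-through, then add edge overshoot.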


Right now I don't know whether the 6800 is any better with DVD than the 5700. I have read elsewhere that the NVIDIA folks were particularly proud of the 5700 card's clarity. I was definitely impressed with it using either Sonic or WinDVD 6 with 1920x1440 Lanczos and denoise3d. Since it's $250 extra, I may take it back and get the 5700 again, as I am not a huge gamer. But I like the fact that the 6800 can run anything I throw at it, whether that's any game, VMR9, WMV, or whatever.
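For reference, the Lanczos resize mentioned above weights nearby source pixels with a windowed sinc. A minimal sketch of the kernel and the per-output-sample tap weights (ffdshow's actual implementation differs in details like fixed-point math):

```python
import math

def lanczos(x, a=3):
    """Lanczos-a kernel: sinc(x) * sinc(x/a) for |x| < a, else 0."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def resample_weights(src_pos, a=3):
    """Normalized tap positions and weights for one output sample
    centred at fractional source position src_pos."""
    taps = list(range(math.floor(src_pos) - a + 1, math.floor(src_pos) + a + 1))
    w = [lanczos(src_pos - t, a) for t in taps]
    total = sum(w)
    return taps, [v / total for v in w]
```

Upscaling 720 lines to 1440 means evaluating these weights at half-line offsets; the kernel's negative lobes are what preserve apparent edge sharpness compared to bilinear scaling.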


There is one thing I have noticed with both cards that I do not fully understand. When I set up ffdshow and play a DVD, after a certain period of time the screen changes and becomes brighter, with different gray levels. It is as if either ffdshow loses control or the GeForce card takes over. It's almost like the picture with VMR9. I do not understand what is happening here, as it's frustrating to get the picture the way you like it only to have it become brighter and washed out after a while. Now that I know it will happen I can set it right ahead of time, but it really throws me when it happens. FYI, I am also using PowerStrip but don't know if that has anything to do with it. Comments here are really welcome.



Thanks, Brian


----------



## jcmccorm

Hey Brian, sorry I can't comment on your problem, but thanks for your review!


Cary


----------



## deronmoped

ATI has a Radeon in the works that blows away these other cards. It's due out soon.


Deron.


----------



## VideoGrabber

usabrian commented:

> the 5700 was stunning. But it was unable to run VMR9 at all, and VMR7 had significant tearing.


----------



## genmax

Still running the nVidia ForceWare 3x "beta", using VMR9 with ZP, and loving it!


5700 non-Ultra (switched the Ultra into my gaming PC) on a 2.0 GHz P4 (Shuttle XPC).


Best picture I've had ever.


----------



## Briands

Any new updates on the nVidia front? Unfortunately I bought the 5700 a month ago. Looks like I should have waited for the 6xxx series... It may have worked with the MyHD card as well.


BTW, I installed the Arctic Cooler, and on my Chaintech card it needed some additional clearance on the back to prevent contact with some little components.


Also, using the driver (not sure of the version) I am able to adjust the timing to 71.295 Hz as well as adjust the porches, etc. No need for PowerStrip!!!


----------



## genmax

I just recently dropped PowerStrip as well since everything you need is in the new NVidia drivers.


----------



## MikeReilly

Comparing a 6800 to a Radeon 9000 is like comparing a Ferrari to a Volkswagen. My 9800 XT blows away my 9000 as well, but I'd expect that since the 9000 is a few generations of GPU in the past.


----------



## maxse

Guys, for an HTPC, is there a big difference between the 5600 non-Ultra and the 5600 Ultra in terms of movies using the TV-out?


----------



## Graham Johnson

Briands,


I have a 5700 Ultra, and have to have PowerStrip loaded, much as I don't want to.


Where are the porch adjustments in the nVidia driver???


I have looked high and low for them.


I can move the picture around the screen to centre it, but can't find the porch adjustments to resize it.


HELP


----------



## Briands

I'll have to take a look when I get home, but as I recall the menu selection "analog display" opens a box with an "Advanced timing" button. In there you will find many settings similar to PowerStrip's.


----------



## VideoGrabber

stoub wrote:

> Well, without PowerStrip, when I set 1280x720 the projector just blanks and doesn't recognize the signal anymore...


----------



## Rickd

"For those of you not following the Theater Tek2 threads and the threads on the HTPC forum. Here is an update on using a nVidia card.


I just learned from Andrew over at Theater Tek that the new TT2 will also use special features only available of the new nVidia 6600 and up cards when they come out next month. When the 1st 6600 comes out it will only be available as a PCI-Express card. Then a month later it will be available as an AGP card.


The newest nVidia cards will have even better video qualities than that the current nVidia 5700's and 5900's are using. TT2 will use those benefits.

This is all good if you want to stick to a hardware solution rather than a software solution (FFDshow). To use FFDshow you must have a P4 2.8ghz or a AMD64 CPU to get the full benifit of software rendering. They are saying that if you want to stick with your current CPU/MB and wish to go the hardware route then the newest nVidia 6600 cards and up are the way to go."


As per the above, has any testing been done on this card yet with the MP-1?


If you are using software rendering, i.e. FFDshow, then presumably the 5900 is still the best option?


Where can I buy an MP-1 for the new card?


----------



## mp20748

Quote:

_Originally posted by Rickd_

*As per the above, has any testing been done on this card yet with the MP-1?


If you are using software rendering, i.e. FFDshow, then presumably the 5900 is still the best option?


Where can I buy an MP-1 for the new card?*
Yes, so far the MP-1 has been attached to the following nVidia cards:


- 5700


- 5950


- 6800


The marriage has been a very good one. The cards have exceptional DACs on them. The listed nVidia cards, as well as a Matrox Parhelia, are the only video cards I've been able to test that would properly handle the very difficult bandwidth requirement of true 1080P HDTV resolution. The ATIs will do as well, but will not maintain the same low noise floor at the higher scan rates. So far I've not looked at the newer ATIs (I have one coming my way), and have only tested the 9500 and 9800s for this performance factor. These cards will clearly outperform any stand-alone processor that I've had a chance to play with. I've yet to see a stand-alone processor truly handle 960P - but I've not looked at the multitude of units out there.


My latest mod has not been released, it's only in the hands of a few people so far. The cost will not change.


The newer cards have increased in performance, and likewise the newer MP-1v3 makes a perfect companion for the extra performance of these cards. So much so that I'm trying not to toot my own horn. So I'm looking forward to those who now have the MP-1v3 attached to either nVidia or ATI cards saying what they have experienced. I'm now waiting to hear what the end user has to say.


----------



## jimwhite

"I'm mike parker - and I approve this message"


 ROTFL


----------



## Chuchuf

I have one of MP's mods on a 5700 (will change to a 6800 or 6600 soon) and I can say that I'm very impressed with it.

I haven't had much time to post on this, or really even test the card that much, until the past two weeks, but I have had it for a few months and have been using it for the past month.


Terry


----------



## VideoGrabber

I'm not sure who Rick was quoting:

> They are saying that if you want to stick with your current CPU/MB and wish to go the hardware route then the newest nVidia 6600 cards and up are the way to go. These cards will clearly outperform any stand-alone processor that I've had a chance to play with. I've yet to see a stand-alone processor truly handle 960P - but I've not looked at the multitude of units out there.


----------



## JBJR

Tim,

Mike has looked at the Lumagens and I believe his above statement stands. I've been there for some of his testing.


Hi Mike,

Been getting pretty busy for me up here, that time of the year you know. I'll give you a call tomorrow.


John


----------



## VideoGrabber

JBJR wrote:

> Mike has looked at the Lumagens and I believe his above statement stands. I've been there for some of his testing.


----------



## mp20748

Quote:

_Originally posted by VideoGrabber_
*


I should mention that the old Visions (which I have) are somewhat bandwidth limited beyond 840p (but still look very good at 960p), and are exceeded by Lumagen's newer units.


Thanks for the update.


- Tim*
That's about right. Somewhere right before 960P. But the earlier Vision from Lumagen was one of my personal favorites. The Lumagen rep left one with Mark Haflich to demo. We tested it and found it to be a very nice unit. It did everything well, and it handled the test patterns flawlessly.


It's also a very well designed unit. They chose to use a clean power supply and to buffer the analog output from the DAC. Adding the buffer to the DAC was a very good thing to do, as all DACs should have buffers (and/or line drivers) for long (>10') cable runs. It's a jewel of a scaler that works exceptionally well when used properly. Mark now owns the later SDI version. I've not played with that one, but I'm pretty sure the SDI input would make for a cleaner 960P conversion. I just don't think it's a good idea to scale an analog signal to 960P. And it's not only the Lumagen that has the roll-off near 960P; everything else I've looked at also rolls off before 960P. The exception is the non-analog (SDI, HTPC) units, which do much better.


But for DVD and regular video, the Lumagen does very well, as do so many other scalers. The problem for these units is when/if the intent is to exceed 960P, and this is where I was going earlier with the discussion on 1080P for HDTV. Most of the better video cards will scale DVD to 1080P with ease (for those who watch DVD at 1080P), but 1920x1080P HDTV is a whole different beast to contain. The high-end 9" CRT owners who are using HTPC cards as scalers should make sure they have a good engine in that HTPC for this HDTV standard, because 1080P HDTV is right around the corner.
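To put a number on why 1080P is such a beast, the total raster for 1080p60 is 2200x1125 (the standard SMPTE 274M / CEA-861 figure), and the pixel clock follows directly:

```python
# Back-of-envelope for the 1080p60 bandwidth requirement.
# 2200x1125 total raster at 60 Hz is the published standard timing.
h_total, v_total, refresh = 2200, 1125, 60
pixel_clock_mhz = h_total * v_total * refresh / 1e6
print(pixel_clock_mhz)  # 148.5 MHz

# Rule of thumb (an approximation, not a spec): resolving alternating
# single-pixel detail in analog RGB needs video bandwidth of roughly
# half the pixel clock, plus comfortable headroom beyond that.
min_analog_bw_mhz = pixel_clock_mhz / 2
```

Every stage - DAC, cable, and projector video amplifiers - has to pass that bandwidth cleanly, which is exactly the "headroom" point made below.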


1080P HDTV movie samples can be downloaded off the internet. But in order to experience those samples in their fullness, everything in the video chain has to be performing at a superb level. And yes, specs do lie. For instance, how is it that every brand, make, and model of cable being sold has a cable capacitance in the window of 15 pF to 17 pF? The better-rated cables (RG6) are listed at or near 17 pF. The cheap and very thin junk has the same figures.
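Why that capacitance figure matters: into a high-impedance (improperly terminated) input, a cable run behaves roughly like a lumped capacitor driven through the source impedance, with a first-order corner at f = 1/(2πRC). A sketch of that worst-case estimate (a properly terminated 75-ohm line does not roll off this way - this is only the unterminated case):

```python
import math

def rc_corner_mhz(source_ohms, cable_pf_per_ft, feet):
    """First-order -3 dB corner of a lumped RC: f = 1 / (2*pi*R*C)."""
    c_farads = cable_pf_per_ft * feet * 1e-12
    return 1.0 / (2 * math.pi * source_ohms * c_farads) / 1e6

# 10 ft of ~17 pF/ft cable hanging unterminated off a 75-ohm source:
print(rc_corner_mhz(75, 17, 10))  # roughly 12.5 MHz
```

That corner sits far below HDTV video bandwidth, which is why buffering or line-driving the DAC output and terminating the run properly matter so much at these scan rates.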


Plus, the average spec sheet for a projector lists the bandwidth of that projector, and that bandwidth is determined by the maximum scan rate of the projector. It has nothing to do with the video chain. Actually, the manufacturer should make sure that the projector's video chain (bandwidth) matches the maximum scan rate of the projector. In other words, the projector will sync to very high scan rates, but the video circuit won't properly resolve the video bandwidth at that scan rate.


I've found one thing out over the past two weeks playing around with HDTV: it's gonna need a lot of headroom to really look good. Enough is not enough.


----------



## mikecazzx

Quote:

_Originally posted by mikecazzx_
*OK, I think I've finally read all of this thread.


Looks like a 5700 Ultra will replace my Radeon soon - if it's affordable.*
Following up on this post.


I replaced it with a PNY Verto GeForce 5700 Ultra AGP with 128 MB DDR2 RAM.


I am very happy with the card, and I may even use another one to replace the card in my main work PC.


----------

