Fiber optic HDMI cable

After reading reviews I decided to get one to connect my Roku to my TV. A little expensive, but I like the results. I think the video is clearer and there is better sound, especially on sites like Accuradio.
 
Do a double-blind test.
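If anyone actually runs that test, here's a minimal sketch of how the scoring might work: the listener guesses which cable is connected on each randomized trial, and you check whether the guesses beat coin-flipping. The trial numbers below are made-up placeholders, not real results.

```python
# Hypothetical blind A/B scoring: did the listener's guesses beat pure chance?
from math import comb

def p_value(correct, trials):
    """One-sided probability of guessing at least `correct` right by coin-flipping."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Example placeholder result: 12 correct identifications out of 20 blind trials.
print(f"p = {p_value(12, 20):.3f}")  # ~0.25 -- indistinguishable from guessing
```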
 
Digital is digital: if the 1s and 0s can get there uninterrupted, you aren't going to be able to discern a difference.
 
Originally Posted by OVERKILL
Digital is digital: if the 1s and 0s can get there uninterrupted, you aren't going to be able to discern a difference.

Yep.
 
Originally Posted by OVERKILL
Digital is digital: if the 1s and 0s can get there uninterrupted, you aren't going to be able to discern a difference.

But, but, but... it was expensive, so it must be better.
 
Originally Posted by eljefino
Do a double-blind test.


Originally Posted by OVERKILL
Digital is digital: if the 1s and 0s can get there uninterrupted, you aren't going to be able to discern a difference.

Originally Posted by Quattro Pete
Originally Posted by OVERKILL
Digital is digital: if the 1s and 0s can get there uninterrupted, you aren't going to be able to discern a difference.

But, but, but... it was expensive, so it must be better.



+1, IBTL. Plus, there's got to be a converter to go from copper to optical, right? I'd argue that means at best you'll only match the performance of a good traditional HDMI cable, and can't actually improve anything. Any conversion or additional connection introduces some noise/artifacts somewhere down the line. Unless you're running insane distances from source to TV, there's no benefit.
 
Originally Posted by ffhdriver
and there is better sound, especially on sites like Accuradio.

Considering that Accuradio streams at 24-32 kbps, the quality can't be all that great, regardless of what cable you use.

As far as free radio apps on Roku go, I think Spotify and TuneIn offer better quality.
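For a rough sense of scale (a back-of-the-envelope sketch; the 32 kbps figure is just the upper end of the rate quoted above):

```python
# Uncompressed CD-quality PCM vs. a ~32 kbps web radio stream.
cd_kbps = 44_100 * 16 * 2 / 1000   # sample rate * bit depth * channels ~= 1411 kbps
stream_kbps = 32                   # upper end of the quoted Accuradio rate
print(f"CD PCM: {cd_kbps:.0f} kbps vs. stream: {stream_kbps} kbps "
      f"(~{cd_kbps / stream_kbps:.0f}x less data in the stream)")
```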
 
I don't get it. HDMI runs on copper. Fiber optic requires an optical digital signal. So you need to go copper to light, fiber-optic HDMI, then light back to copper.

Two conversion stages give MORE chances for errors.

It would likely be great for long runs or spots where fiber gives an advantage. But for connecting two close-by devices, it seems iffy.
 
Originally Posted by JHZR2
I don't get it. HDMI runs on copper. Fiber optic requires an optical digital signal. So you need to go copper to light, fiber-optic HDMI, then light back to copper.

Two conversion stages give MORE chances for errors.

It would likely be great for long runs or spots where fiber gives an advantage. But for connecting two close-by devices, it seems iffy.


They're primarily designed for long runs, as you posit. There's an optical converter on each end of the cable, powered by the device it's attached to.
 
Originally Posted by wag123
Originally Posted by OVERKILL
Digital is digital: if the 1s and 0s can get there uninterrupted, you aren't going to be able to discern a difference.

Yep.

People said that about CDs when they came out.

Most CD players** for over a decade did not meet the S/P white-paper spec for noise and distortion for a 16-bit/44.1 kHz sample.

They had high error-correction and concealment rates.

Time slicing is analogue, and it's a BIG part of digital data transfer.

Jitter is a data killer.
______________________

I'm returning my Roku Ultra LT (if they can't steal my info off the chipset!).

It's got a poor picture engine, or it doesn't like my Samsung LED.

Visible motion smudging and smearing.

Flat facial images that don't look real or POP.

** The Philips Magnavox FD1000 SL was a good exception.
 
Originally Posted by ffhdriver
After reading reviews I decided to get one to connect my Roku to my TV. A little expensive, but I like the results. I think the video is clearer and there is better sound, especially on sites like Accuradio.



More likely it's galvanic isolation.

Too many different GND references going on.

Always tricky with outside cable coming in and too many gizmos plugged in, all at different chassis bias, then add in noisy switching-supply wall warts.

I move stuff around to different power strips, and all %#*&& breaks loose with my audio or video.

Need to read "Zen and the Art of Digital Video Maintenance".
 
Bought our first big TV at Circuit City (remember them?) and the salesman asked why I would spend $1,300 on a TV and not buy the $100 HDMI cable so we could get all the wonderful goodness from it. Fell for it; didn't know any better...
 
Originally Posted by rekit
Bought our first big TV at Circuit City (remember them?) and the salesman asked why I would spend $1,300 on a TV and not buy the $100 HDMI cable so we could get all the wonderful goodness from it. Fell for it; didn't know any better...

My old company threw out 6-meter pro-grade HDMI cables by the dumpsterful; they came with the large monitors we rebuilt for "special" customers.

I saved a few from the landfill.

I have one on my TV now. It works great compared to a mid-grade Monster or a flexible but skinny GE.
 
Don't feel bad. All HDMI cables were $100 back then. Or you could use component cables.

Electrical Engineer here. All of the responses are correct. If I were to design such a product, it would only be for a long run where the two pieces of equipment are on different circuits and the ground potential could be different. I'm sure there are IC optocouplers/digital isolators that can accomplish the same task without the fiber.

HDMI errors are obvious, seen as blatant random garbles in the audio or video. They are not subtle analog qualities like hum, hiss, or a fuzzy/grainy picture. That's what I love/hate about digital. A so-so analog signal is still very watchable with a little "snow" in the picture and hiss in the audio; a so-so digital signal is very annoying, with ugly pixelation and complete dropouts.

If I were unscrupulous enough to make some fiber-optic HDMI cables that "make the picture better", I would add audio processing that receives the original digital audio and applies some digital filtering/equalization to increase the bass and treble before sending it on to the TV. An A-B test would show an increase in those areas and folks would think it's better. You can do the same thing with your TV.
I might even put BOSE on it. :p The same thing could be done with the video data: a slight sharpness increase or contrast boost and there it is, an easy A-B comparison winner. The difference is REAL! LOL.
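For illustration only, a minimal sketch of the kind of "secret sauce" trick described above: crudely lifting the bass in a block of PCM samples before passing it along, so an A-B listen "proves" the cable sounds different. The function name, filter choice, and parameters are all invented for this example, not taken from any real product.

```python
# Hypothetical in-line audio tampering: add a low-passed copy of the signal
# back in, which lifts everything below the cutoff by a few dB.
import numpy as np
from scipy.signal import butter, lfilter

def sneaky_bass_boost(pcm, fs=48_000, cutoff_hz=200, boost_db=4.0):
    b, a = butter(2, cutoff_hz / (fs / 2), btype="low")  # 2nd-order low-pass
    lows = lfilter(b, a, pcm)                            # isolate the low end
    gain = 10 ** (boost_db / 20) - 1                     # extra level to mix back in
    return pcm + gain * lows

# Demo: a 1 kHz tone plus a 60 Hz component; only the 60 Hz part comes out louder.
t = np.arange(48_000) / 48_000
x = np.sin(2 * np.pi * 1_000 * t) + 0.2 * np.sin(2 * np.pi * 60 * t)
y = sneaky_bass_boost(x)
```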
 
d_y

Good response.

Spent a few decades at Bell Labs.

Last job was helping production set up to build and repair fiber-optic long-haul "repeaters" that did not do glass-to-copper-to-glass in the amp chain.

Magic stuff.

Yes, I was pretty much a key cog in building up the digital long-haul infrastructure through Lucent Tech and AT&T.

.... Pats self on back and dislocates shoulder ....
 
Originally Posted by ARCOgraphite
Originally Posted by wag123
Originally Posted by OVERKILL
Digital is digital: if the 1s and 0s can get there uninterrupted, you aren't going to be able to discern a difference.

Yep.

People said that about CDs when they came out.

Most CD players** for over a decade did not meet the S/P white-paper spec for noise and distortion for a 16-bit/44.1 kHz sample.

They had high error-correction and concealment rates.

Time slicing is analogue, and it's a BIG part of digital data transfer.

Jitter is a data killer.
______________________

I'm returning my Roku Ultra LT (if they can't steal my info off the chipset!).

It's got a poor picture engine, or it doesn't like my Samsung LED.

Visible motion smudging and smearing.

Flat facial images that don't look real or POP.

** The Philips Magnavox FD1000 SL was a good exception.





A CD player is reading digital data and typically converting it to analog unless you are piping it to a separate DAC. We are discussing a simple transport medium for digital data with no intermediate processing steps; vastly simpler. It's like saying that one Toslink cable that's $500 is better than one that's $30 even if they are byte-identical at the endpoint. It doesn't work that way. The idea that a "better" cable will improve the experience with digital media, if the data is already getting there properly, is simply capitalizing on ignorance and the legacy of high-end analog cables, where EMI and other sources of "noise" could in fact be mitigated with better, more isolated mediums.
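A minimal sketch of what "byte-identical at the endpoint" means in practice: if two captures of the same stream hash identically, nothing downstream can tell which cable carried them. The capture file names here are hypothetical.

```python
# Compare two hypothetical bit-for-bit captures of the same digital stream.
import hashlib

def file_hash(path, algo="sha256"):
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

cheap = file_hash("capture_cheap_hdmi.bin")
fancy = file_hash("capture_fiber_hdmi.bin")
print("identical payloads" if cheap == fancy else "payloads differ")
```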
 
Originally Posted by danez_yoda
Don't feel bad. All HDMI cables were $100 back then. Or you could use component cables.

Electrical Engineer here. All of the responses are correct. If I were to design such a product, it would only be for a long run where the two pieces of equipment are on different circuits and the ground potential could be different. I'm sure there are IC optocouplers/digital isolators that can accomplish the same task without the fiber.

HDMI errors are obvious, seen as blatant random garbles in the audio or video. They are not subtle analog qualities like hum, hiss, or a fuzzy/grainy picture. That's what I love/hate about digital. A so-so analog signal is still very watchable with a little "snow" in the picture and hiss in the audio; a so-so digital signal is very annoying, with ugly pixelation and complete dropouts.


Awesome post.


Yes, exactly: digital data errors are obvious. If the information is able to make the trek properly, whether it's a $500 or $5 cable is irrelevant.

Originally Posted by danez_yoda
If I were unscrupulous enough to make some fiber-optic HDMI cables that "make the picture better", I would add audio processing that receives the original digital audio and applies some digital filtering/equalization to increase the bass and treble before sending it on to the TV. An A-B test would show an increase in those areas and folks would think it's better. You can do the same thing with your TV.
I might even put BOSE on it. :p The same thing could be done with the video data: a slight sharpness increase or contrast boost and there it is, an easy A-B comparison winner. The difference is REAL! LOL.



Nailed it! You'd be 100% right that the "perceived benefit" is the result of intentionally altering the data stream; you'd have folks lauding your cable like it was the second coming!
 
Originally Posted by JHZR2
I don't get it. HDMI runs on copper. Fiber optic requires an optical digital signal. So you need to go copper to light, fiber-optic HDMI, then light back to copper.

Two conversion stages give MORE chances for errors.

It would likely be great for long runs or spots where fiber gives an advantage. But for connecting two close-by devices, it seems iffy.

It's good marketing. Just like gold-plated connectors being "better". Maybe electrically (measured with lab equipment, it is), but discernible by human ears? Doubtful...
 