We realize that a lot of viewers have strong beliefs about the infallibility of digital signals. That idea may be well founded. Or not. We're listening to this product to see if we can hear it do anything that a basic network switch does not, because we have found time after time that listening offers information that theories sometimes do not. If you are open to the possibility of errors in digital transmission, we invite you to read this white paper, which explains why all 1s and 0s might not be equal:
www.maldesigns.ca/pdf/UpTone-J.Swenson%20EtherREGEN%20%27white%20paper%27.pdf
Actually, the paper offers three theories, which do nothing more than suggest to us that some empirical evaluation might be warranted.
"We realize that a lot of viewers have strong beliefs about the infallibility of digital signals. That idea may be well founded. Or not. "
It depends on the implementation. You could define a digital transmission scheme that has a lot of data errors, but in the real world we work with well-known transmission protocols and techniques like Ethernet, and it is not a question of belief how high or low the error rate is when data is transmitted with this technology: the maximum error rate is defined by the Ethernet standard and can also be measured.
So, for example, when you transmit data between two switches or routers, a single bit error is allowed in an amount of data corresponding to about 100 audio CDs.
Under normal circumstances the error rate is much lower still, so you would have to transmit about 1,000 audio CDs to get a single bit wrong.
But it gets even "crazier": when TCP/IP is used on top of Ethernet, these very rare errors are detected, and the receiver asks the sender to retransmit the packet that contained the error.
So as long as no cable is broken and your switch is not defective, you do not have to worry about transmission errors at all... and that is well founded by the hundreds of millions of emails and other data transmitted over the Internet every hour.
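For concreteness, the figures above can be turned into a quick back-of-the-envelope calculation. This is just a sketch using nominal numbers (~700 MB per audio CD, and the "one allowed bit error per ~100 CDs" claim from the comment):

```python
# Working backwards from the claim: one allowed bit error per ~100 audio
# CDs of data implies the bit error rate below (illustrative figures only).
CD_BYTES = 700 * 10**6          # ~700 MB per audio CD (nominal)
CD_BITS = CD_BYTES * 8          # 5.6e9 bits per CD

implied_ber = 1 / (100 * CD_BITS)
print(f"implied BER: {implied_ber:.1e}")   # one error in 5.6e11 bits

# At a CD-quality stream (16-bit/44.1 kHz stereo, ~1.4 Mbps), that is one
# expected bit error per:
STREAM_BPS = 44_100 * 16 * 2
seconds_per_error = (100 * CD_BITS) / STREAM_BPS
print(f"{seconds_per_error / 86_400:.0f} days of continuous playback")
```

And that is before TCP's checksums and retransmissions, which catch the rare error that does slip through.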
Have you ever received an email where a letter was wrong because of a transmission error?
Have you ever even heard of such a thing?
The answer is no, because in practice it never happens, even when an email is sent halfway around the earth...
But "audiophiles" worry about such errors occurring in the ten-metre network in their living room...
It is total nonsense!
I read the document. And as many have pointed out, it is nonsense. The fallacy is in the first line of your remark: this is not about the digital signal. TCP/IP does not work this way. Clocks and jitter sound all nice and technical, and they are relevant when it comes to feeding data to the DAC, but they don't apply to TCP/IP.
A little background: I wrote a book on performance testing more than a decade ago. I won't link to it, as I am not here to promote the book. But one of the main points in it is that computers either run at 100% or are standing still. If your CPU measurement says 60%, people think of it as a car running at 60% of its maximum speed. It is not; it is running at full speed for 60% of the time. If anyone wants to read more, check out en.wikipedia.org/wiki/Queueing_theory
This is similar to how TCP/IP works in this respect. Most consumer network devices these days run at 1 gigabit per second (1 Gbps). That is far more data per second than the bitrate of even the highest-quality lossless music file. Even older devices still running at 100 Mbps are much faster than needed. The network does not slow down to the bitrate of the music; it spits out the data at 1 Gbps, and keeps doing so until the client device's buffer is full and it temporarily stops asking for more data.
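As a rough illustration of that headroom, here is a sketch with nominal, assumed bitrates (uncompressed PCM; lossless compression only widens the gap):

```python
# Link speed vs. music bitrate, using nominal figures (assumptions, not
# measurements of any particular device).
GIGABIT = 1_000_000_000          # 1 Gbps consumer Ethernet
FAST_ETHERNET = 100_000_000      # 100 Mbps, the older standard mentioned

CD_QUALITY = 44_100 * 16 * 2     # 16-bit/44.1 kHz stereo PCM, ~1.4 Mbps
HI_RES = 192_000 * 24 * 2        # 24-bit/192 kHz stereo PCM, ~9.2 Mbps

print(f"1 Gbps vs CD quality:   {GIGABIT / CD_QUALITY:.0f}x headroom")
print(f"1 Gbps vs hi-res PCM:   {GIGABIT / HI_RES:.0f}x headroom")
print(f"100 Mbps vs hi-res PCM: {FAST_ETHERNET / HI_RES:.0f}x headroom")
```

Even the slowest case leaves an order of magnitude of spare capacity, which is why the buffer fills far faster than playback drains it.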
Why is it then often called streaming? The term indicates that the client device that gets the music file from, e.g., a NAS or a streaming service starts playing the music even before the entire file has been received.
So if the buffer is large enough, the entire music file can be in the client device within seconds. You can verify this by disconnecting the network cable during playback: the song can continue to play for seconds or even longer. How would that work with your audiophile-grade clocks in switches? The only clock that can have any effect is the one in the client device.
It sends data in packets (not signals!). In TCP/IP the packets don't even have to arrive in the right order. If a packet doesn't arrive intact, it will be requested again. The data then behaves much as if the file were on storage in the client device that is sending the data to the DAC. It doesn't matter how good or bad the "signal" transporting the data (the packets) to the client device is, as long as it is not so bad that the buffer runs empty. It doesn't matter how bumpy the ride was, as long as the data arrived intact and before the buffer ran empty. You can think of it as a funnel: it doesn't matter for the stream leaving the funnel whether the supply is steady or irregular; as long as the funnel doesn't run empty, the stream will continue.
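The funnel analogy can be sketched as a toy simulation (all numbers hypothetical; real players use much larger buffers and flow control):

```python
# Toy funnel: data arrives in irregular bursts, playback drains at a
# steady rate, and the output is unaffected as long as the buffer
# never runs empty.
arrivals = [0, 4000, 0, 0, 6000, 0, 2000, 0, 0, 4000]  # bytes/tick, bursty
drain = 1000                                            # steady playback rate

buffer_level = 3000   # pre-buffered before playback starts
underruns = 0
for incoming in arrivals:
    buffer_level += incoming
    if buffer_level >= drain:
        buffer_level -= drain   # one tick of glitch-free playback
    else:
        underruns += 1          # only here could the listener hear anything

print("underruns:", underruns)
```

Despite the bursty arrivals, the steady drain never stalls, which is the whole point: timing irregularity upstream of the buffer is invisible downstream of it.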
If the buffer does run empty, even an untrained listener will hear it. This will not be about "sound stage" or "separation of instruments"; these will be very clear artifacts, similar to a CD so damaged that error correction can't fix it.
What about noise and "leakage current"? If any of that happens to come from the network switch, you need a better client device; it should never be able to make it to the analogue stage.
And anyone even a little bit familiar with how the brain works knows how well it can fool you. Do I trust your hearing, or physics and measurements? Occam's razor tells me not to trust someone's hearing over physics and measurements.
If you want a real test, do a proper blind test. A proper blind test is where you take a normal switch and an "audiophile-grade" switch. Listen to them both until you think you can tell the difference. Then have someone trustworthy make you listen to each of them at least 10 times in random order (so 20 times in total), obviously in such a way that you can't see or otherwise know which switch is being used.
All you have to do each time is say whether you are listening to the cheap one or the audiophile-grade one.
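Why the number of trials matters can be checked with a few lines of standard-library Python: under the null hypothesis that the listener genuinely cannot tell the switches apart, each trial is a fair coin flip.

```python
# Chance of scoring well by pure guessing in an n-trial blind test.
from math import comb

def p_at_least(k: int, n: int) -> float:
    """Probability of k or more correct answers in n 50/50 guesses."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2**n

# All 20 of 20 right by luck: about one in a million.
print(f"{p_at_least(20, 20):.7f}")
# Even 16 of 20 right by luck is under 1%.
print(f"{p_at_least(16, 20):.4f}")
# But 4 of 5 right happens by luck almost one time in five.
print(f"{p_at_least(4, 5):.4f}")
```

That last line is why a handful of casual A/B comparisons proves nothing: with too few trials, impressive-looking scores arise from guessing alone.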
This is important stuff. I, like many others, looked at reviews from respected names to help me figure out which products are good quality, only to find out that these respected names "with the golden ear" claimed all sorts of nonsense about how different network switches sounded. As a result, I really stopped trusting them. I would say the veil has been lifted for me: their reviews are utterly subjective and not to be trusted.
"We realize that a lot of viewers have strong beliefs about the infallibility of digital signals."
A lot of viewers? If I read the comments, more than 95% do not believe what you are claiming.
I'm not even going to read a document hosted on a site that sells everything based on audio-wankery buzzwords.
Short answer to this question: it's not infallible. It either gets there or it doesn't.
It's buffered and error-corrected anyway, so the original file will ALWAYS make it, even if error correction had to be used and the buffer kicked in.
Network engineer here, and also an A/V enthusiast. These products offer no data to back up their claims. Ethernet is an asynchronous, frame/packet-based, highly buffered, high-speed packet pusher.
Let's cut through all the B.S., though. I'll give you 10:1 odds that on a resolving headphone setup you can't tell the difference between switches.
4:55 "first temperature controlled and compensated crystal oscillators so the oscillators are what determine the timing of the fundamental signals in the switch and you want them temperature compensated because
obviously temperatures vary in your home and temperatures vary inside the box as it's being used and heated up and so on"
Quartz-based oscillators are already very precise (that's why they were used in most digital watches), and yes, the thing that affects them most (though still at a very low level) is temperature. But at home you typically don't have huge changes in temperature (more than 20 °C), so the impact is very small.
But the real point is that the mentioned "timing of the fundamental signals" has allowed tolerances in Ethernet, and as long as you stay within those tolerances everything works perfectly; a standard quartz oscillator is more than accurate enough to do so.
Think of Morse code, which consists of dots and dashes (short beeps and long beeps).
By definition, a dash (long beep) must last three times as long as a dot (short beep), but humans using a button to create a Morse signal are not 100% accurate, so the duration of a dash might be a little less (or a little more) than three times the duration of a dot.
Does that cause errors in the transmitted data?
No. Even when you send a "dash" that is only 2.1 times as long as a "dot", the receiver can still identify it as a dash, and if a "dot" is 1.4 times longer than defined, the receiver can still identify it as a "dot"...
The same applies to signals on Ethernet: there are tolerances, and as long as you work within those tolerances everything is fine; there is no need for a "high-precision" temperature-compensated quartz oscillator to keep the signals inside the allowed tolerances.
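The Morse-code tolerance argument can be sketched as a trivial classifier. The threshold and the beep durations here are illustrative, not from any spec: the point is only that a decision threshold absorbs timing slop.

```python
# Toy version of the Morse analogy: classify beep durations as dot or
# dash with a simple threshold, tolerating sloppy timing.
DOT = 1.0          # nominal dot duration (arbitrary units); dash is 3x
THRESHOLD = 2.0    # up to 2x a dot reads as a dot, anything longer as a dash

def classify(duration: float) -> str:
    return "dot" if duration <= THRESHOLD * DOT else "dash"

# These sloppy timings (a stretched dot, a shortened dash) still decode
# exactly as intended:
print(classify(1.4))   # long-ish dot  -> dot
print(classify(2.1))   # short dash    -> dash
print(classify(3.6))   # long dash     -> dash
```

Digital receivers work the same way in principle: as long as a signal edge lands inside its tolerance window, the decoded bit is identical, so "better" timing inside the window changes nothing.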
Curious to know how you got on with the box. Please let me know if there's a part two coming?
Jitter only matters at the D-A conversion. Prior to that, digital signal bits are interpreted as either 0s or 1s by sampling them well after the voltage transition, where the levels have settled as unambiguously high (1) or low (0). Some digital errors occur, but there are many mechanisms to ensure signal integrity.
Jitter in the Ethernet stream is not the same as jitter that distorts the analog signal. Noise in the DAC, where jitter does matter, is much lower than in the Ethernet stream and there are various techniques to sample during D-A conversion without introducing relevant jitter.
ruclips.net/video/TT9JL2yaIOA/видео.html
12:21 "but suffice it to say there is good technical reason to believe that jitter or timing errors matter in the ultimate recreation of the analog signal"
Yes, jitter and timing errors (they are the same thing) matter at the point where the digital data is converted to an analog signal. But that happens not in an Ethernet switch; it is done in the streamer.
12:35 "hence we care about how we start and how much jitter we start with"
No. We do not care about the jitter at the point where the signal chain starts (which might be a server 10,000 km away); we care about the jitter at the end of the digital chain, just before the DAC.
Every streamer has (and needs to have) a buffer in which a couple of seconds or even minutes of music data is stored.
This buffer is filled by the data received over the Internet, and it is read out at a speed defined by the clock in the streamer.
So it is the clock in the streamer that determines the jitter, not the clock in a switch before the streamer.
The only thing that must be ensured is that this buffer never runs empty (which would cause an interruption in playback), so the Internet connection must be able to deliver the data faster than it is needed for playback. This is normally not a problem with today's Internet connections.
If it is a problem, there is no way an "audiophile switch" can fix it.
And that is the reason why none of the manufacturers of such devices back up their claims with either proper measurements or double-blind tests.
12:41 "and that's why we're interested in products like this English Electric 8 switch"
The reason is that sellers earn a lot of money with this overpriced voodoo stuff, and some customers with limited technical knowledge believe they can improve the sound quality by buying it.
Digital audio is buffered and re-clocked in the DAC. And jitter is easily measured. This video makes me question everything I see from TAS.
Exactly. Jitter is noise that, unless the DAC is very broken, is well below the human hearing threshold. ruclips.net/video/TT9JL2yaIOA/видео.html
3:42 "the first that i'd like to talk about is the fact that timing in digital signals is very important. If you do a little bit of study on how digital signals work the element that we typically think of as frequency is to some degree embodied in the spacing or the timing of the zeros and ones that make up the digital signal therefore timing is very important"
If you really had done some "study on how digital signals work", you would know that there are many different protocols for transmitting digital data; some of them have "critical" timing, while for others timing matters practically not at all.
Customers need switches that transfer data without errors, and the industry provides switches that do exactly that.
If "timing" were a real problem, tell me why all those Excel files, pictures, and movies can be transmitted through a normal switch without a single error, but when it comes to music it should "suddenly" become a problem?
A switch does not know what type of data is in the packets it transmits, and even if it wanted to know, in many cases it could not find out, as more and more data is encrypted while it is sent over a network.
As a professional network engineer with around 35 years of experience with Ethernet and the protocols running over it, this is pure BS. Jitter at the bit level is absolutely irrelevant. The bits get reassembled into packets up to 1500 bytes long before they get anywhere near a DAC. Software then decodes the packets into the stream of digital audio data, which is then decoded and fed into the DAC. Bit-level network jitter from an Ethernet LAN cannot possibly impact the timing that's actually driving the DAC.
Total BS, used to sell overpriced snake oil.
Yeah, total BS... Networking equipment is engineered to transmit data without any errors over a best-effort network. Then the receiving end (PC, streamer, etc.) has to buffer it sufficiently to account for the variable throughput of the incoming data. Then you just send this lossless data to the DAC. The best way is asynchronous USB, which results in the DAC using its own clock to regenerate the analog signal. And unless you have a seriously broken DAC, the DAC's clock jitter is just noise well below the human hearing threshold... ruclips.net/video/TT9JL2yaIOA/видео.html
As a former network engineer, I call bullshit. Error correction ensures the data is sent correctly. Trillions of bits of data are sent over Ethernet networks daily with websites, emails, etc all transmitted bit perfect from source to destination. Enough noise could potentially slow the data transfer rate, but the bitrate of streamed music is so far below the maximum transfer rate of an Ethernet switch that it would take a massive amount of noise before you would ever notice. Even then, any streaming device I've ever dealt with would just pause the music until more data came through.
I have one of these and it looks really nice. Maybe it works OK I don't know
You obviously don't know how the Internet works, or how TCP works at each step of the way from the source to the receiver.
The audio quality depends only on the quality of the DAC, not at all on the Internet!
BS. Ethernet is packet-based with checksums. The only way you're going to lose quality is if you start dropping packets. And before you bring up jitter: the audio is contained in the packet. As long as the packets arrive in order and get decoded without errors, you physically can't tell the difference. Sure, good-quality analog cables make a difference. Digital doesn't matter as long as it meets the specifications. Give me some hard numbers and maybe I'll change my mind.
I own the English Electric 8Switch, but before that I was running everything that's connected via Ethernet directly from my Orbi router. My system sounded and looked good even before this purchase, but this thing is next level. I can't comment on the technical details because I know nothing about them, but as a musician who has been playing analogue instruments for 45 of my 58 years on earth, this thing makes my system sound more like real music instead of some computerized facsimile of the real thing, and the TV picture went from great to outstanding. The best investment in picture and sound that I've ever made.
Another peddler who has no clue...
I had the EE 8 switch.
It did nothing in my system.
What did make a difference was Quantum Science fuses. They gave me the sound I was looking for. Although expensive for fuses, they saved me about 5k on a new DAC.
I found out the price and laughed. Ridiculous as it doesn’t make any difference at all. Listen to the professionals in network engineering.
Total snake oil B.S. 1’s and 0’s don’t have a sound signature.
You say streaming is the primary source of audio like that is a fact.
Absolute rubbish. It cannot make any difference because of layers of abstraction and error detection/correction. Please get properly educated.
Promoting snake oil shines a really bad light on any outlet claiming to have any knowledge of hi-fi audio.
Sorry, but you should not make such videos without at least a basic understanding of how Ethernet and TCP/IP work (which you obviously do not have).
Audiophile switches are probably the biggest BS I've heard of since premium HDMI cables that somehow should improve the image.
You can literally check the accuracy of the transmitted bits (not that it's needed; the TCP protocol does that on its own) with packet-capture software like Wireshark and verify that there is absolutely no difference between what a normal switch transmits and what a stupidly overpriced "audiophile" switch transmits.
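One way to make that verification concrete is to hash the payload extracted from each capture and compare digests. This is a sketch; the file names are hypothetical stand-ins for payloads exported from Wireshark:

```python
# Compare two captured payloads bit for bit via their SHA-256 digests.
import hashlib

def file_digest(path: str) -> str:
    """SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical payloads exported from captures behind each switch.
# If the bits are identical, the digests are identical:
# assert file_digest("payload_cheap_switch.bin") == \
#        file_digest("payload_audiophile_switch.bin")
```

A single differing bit anywhere in either file would produce completely different digests, so matching hashes are strong evidence of bit-identical data.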
The fact that this video isn't disliked to hell is kind of saddening.
Listen to one
Oh dear...
€510!!!
Lol that's cute. I just got my Innuos Phoenix Net 😂
Internet switch :)
Being a computer engineer only undermines your credibility here. There is tons of good scientific data that refutes this. You harp on one thing and ignore the others. Apparently cognitive dissonance causes deafness.