DiyMobileAudio.com Car Stereo Forum

1 - 12 of 12 Posts

·
Registered
Joined
·
313 Posts
Discussion Starter #1
Is there a difference in sound quality between different Bluetooth protocols or standards? I asked my son if he'd like a Bluetooth-to-RCA adapter for his car rather than using the AUX cable. He said he would like one. I read a review of one on Amazon, and a reviewer wrote something like, "it's only Bluetooth 3.0...," but mentioned he didn't notice any sound quality issues and that it was fine for him. So then it got me wondering if there is some limitation of older Bluetooth standards.
 

·
Registered
Joined
·
60 Posts
I’ve heard the latest Bluetooth aptX codec offers better sound quality, but I don't know much more beyond that.
 

·
Listener of Music
Joined
·
3,225 Posts
aptX is near-CD quality (352 kbps, 16-bit/44.1 kHz), and there is also aptX HD (576 kbps). AAC is supported by just about every recent music player.
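For rough context, here is a quick back-of-the-envelope comparison against an uncompressed CD stream (a sketch in Python; the SBC and AAC figures are typical values I'm assuming, not numbers from this thread):

# Rough arithmetic: the Bluetooth codec bitrates above vs. uncompressed
# CD ("Redbook") audio. SBC/AAC figures are assumed typical values.
cd_bps = 44_100 * 16 * 2                  # 1,411,200 bits/s, i.e. ~1,411 kbps

codec_kbps = {
    "SBC (high-quality bitpool)": 328,    # assumed typical
    "AAC over Bluetooth": 256,            # assumed common cap
    "aptX": 352,
    "aptX HD": 576,
}

for name, kbps in codec_kbps.items():
    print(f"{name:28s} {kbps:3d} kbps  ~{cd_bps / (kbps * 1000):.1f}:1 vs CD")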

There is a new low-latency codec, and Sony has a great codec (LDAC) that has not caught on. It's in their Xperia phones.

In a car, though, you won't notice a difference in sound past AAC and aptX.
 


·
Registered
Joined
·
767 Posts
That is one heck of a Good Question.

Here is the answer HERE:


New stuff as of 2020

And the real killer?
It's 2020, and you are still using Qualcomm's tech from 1988.

And because this is based on PC tech?
It's 2020, and we are still downsampling audio files.


The bitter truth is that Bluetooth was never meant to be the audiophile-grade standard. It's supposed to be the convenience standard. We have Wi-Fi for that, and we also had something called ultra-wideband (UWB), which even operates up in the 60 GHz band.

We've got WiDi, Miracast, and a WHOLE HOST of other protocols with WAY more bandwidth.

Hell, I was streaming Redbook back in 1999 over Wi-Fi. Dude, it's been 20 years. Bluetooth 3.0 is still using the same slice of audio-profile bandwidth (for most consumer devices, not all) and packaging it the same way we kept the same telephone cord and just kept finding new ways to ram more data down the same pipe.
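To put rough numbers on that claim, here is a small Python sketch; the throughput figures are my assumptions for illustration, not measured values or spec limits:

# Back-of-the-envelope: how many uncompressed Redbook (CD) streams fit
# down a given link? Throughput numbers below are rough assumptions.
REDBOOK_KBPS = 44_100 * 16 * 2 / 1000               # ~1,411 kbps per stereo stream

links_kbps = {
    "Bluetooth A2DP audio link (practical)": 700,   # assumed real-world figure
    "Bluetooth 3.0/4.0 EDR link (usable)": 2_100,   # assumed share of the 3 Mbps
    "802.11b Wi-Fi, ca. 1999 (throughput)": 5_500,  # assumed real-world figure
    "802.11n Wi-Fi": 100_000,
}

for name, kbps in links_kbps.items():
    print(f"{name:40s} ~{kbps / REDBOOK_KBPS:5.1f} Redbook streams")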

It's almost funny that IrDA can do something like 8 channels of audio (you can still find the spec) but Bluetooth can't.

Tells you something there.
 

·
Listener of Music
Joined
·
3,225 Posts
So what it says is that aptX is likely the best to use for Android and AAC is likely best for Apple, due to consistency. SBC or basic BT could be as good or better, but the lack of common standards doesn't allow it. It doesn't even mention Sony's codec, which mitigates some of the technical problems he cites. Overall a great article, and it's in line with what I thought I knew. I didn't know the "why" until I read that.
 

·
Listener of Music
Joined
·
3,225 Posts
You're right about other tech being better for audio, but that doesn't answer the OP's question.

And you are still using Qualcomm's tech from 1988
This argument is just plain stupid.

Did you listen to music today? Well, electric speakers were invented in the 1860s and last modernized in the 1920s.

Cell phone tech was invented in the '50s and made practical in the '70s.

RCA cables were invented in the 1930s, patented in the 1940s, and modernized... well, never, because they still get the job done.
 

·
Registered
Joined
·
243 Posts
You're right about other tech being better for audio, but that doesn't answer the OP's question.

And you are still using Qualcomm's tech from 1988

This argument is just plain stupid.

Did you listen to music today? Well, electric speakers were invented in the 1860s and last modernized in the 1920s.

Cell phone tech was invented in the '50s and made practical in the '70s.

RCA cables were invented in the 1930s, patented in the 1940s, and modernized... well, never, because they still get the job done.
brutal
 

·
Registered
Joined
·
767 Posts
You're right about other tech being better for audio, but that doesn't answer the OP's question.

And you are still using Qualcomm's tech from 1988

This argument is just plain stupid.

Did you listen to music today? Well, electric speakers were invented in the 1860s and last modernized in the 1920s.

Cell phone tech was invented in the '50s and made practical in the '70s.

RCA cables were invented in the 1930s, patented in the 1940s, and modernized... well, never, because they still get the job done.

With ALL due respect.

That's like saying the Model T is just like the new GT40. Dude, everything you have said is FLAT-OUT misleading at best. I should just stop here. But I'm bored, so let's do this.

RCA cables have in fact been improved over time. I am really surprised that a man of such knowledge did not know what has been done over the years to the simple RCA interconnect. And there is a bunch of it, one part being the cable itself. In fact, the BNC was the direct successor. I guess Mr. Kimber should just stop making cables.

The cell phone? Dear god, man. The original was based on analog signals. Then CDMA, then TDMA, then the whole business of how they modulate the frequency so they can cram more data into the same bands. That alone has changed as well. The '70s? Not that I taught a CLASS about it or anything... :) But... never mind.

Not to mention the CODECs used. The phone in your typical office today is no longer even a POTS phone; it's based on VoIP. That alone has had a few iterations, from PBX to what we have now.

The ONLY reason we are still using Qualcomm's implementation is simply that not only do they hold the patents, they also make the best system for 99% of the people in the world. I hate to be frank about this, but the team they had was the BEST in the world. In fact, much of what is wireless runs not only on their patented software stack but on chips they ALSO made from the GROUND UP.

The single best reason why NO ONE has done better is simply that they can't without breaking compatibility, and thus RELIABILITY. What do I mean by that? The Bluetooth audio standard is not just music streaming; it's also controls.

Bluetooth from the GET-GO was made to replace parallel and SERIAL CABLES.
NOT AUDIO.


Also? Cheap bastards don't want to pay for or fab the new stuff FROM QUALCOMM. Not only that, makers such as Apple hate paying out for things like patents. Not to mention having to open things up to Qualcomm so they can use the standard for testing. Companies are often shy about that. BLAME CHINA.

Hence the major revisions: 2.1, 3.0, and 4.0, and now 5.0.x.
But even with the new 24 Mbps data rate (that's how Dayton's adapter and some others get past the 3 Mbps bandwidth limitation of Bluetooth audio: they go around the regular audio stack, and they don't have to worry about other devices on that sub-Wi-Fi channel transmitting data), FEW MAINSTREAM MANUFACTURERS ARE USING IT.


In fact, if you have more than one device (and that means just driving around), your AAC stream can go down to 125 kbps.
Remember, it does not STREAM AAC; it RE-ENCODES AAC for whatever bandwidth is left of the 3 Mbps after the overhead the protocol needs. That's why the best you will ever see in practice is 512 kbps.* (Windows, Linux, CyanogenMod Android, etc.)

That's why people can "HEAR" the difference. Just having another Bluetooth device in the room can lower the sound quality you get out of Bluetooth. Hence why some use the data band to stream audio, as it is immune from this as long as you are within range. Not a problem in most cars.
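Here is a toy model of that behavior in Python, purely illustrative: the overhead figure, the bitrate ladder, and the function itself are my assumptions, not a real Bluetooth stack API:

# Toy model: a source picking an AAC re-encode bitrate from whatever
# share of the ~3 Mbps link is left after overhead and other devices.
def pick_aac_bitrate_kbps(link_kbps=3000, overhead=0.4, competing_devices=0):
    usable = link_kbps * (1 - overhead)           # protocol headers, ACKs, retransmits
    usable /= (competing_devices + 1)             # other Bluetooth devices share airtime
    for step in (512, 384, 320, 256, 192, 128):   # assumed re-encode ladder
        if step <= usable:
            return step
    return 96                                     # floor before the stream drops out

print(pick_aac_bitrate_kbps())                        # quiet car: 512 kbps ceiling
print(pick_aac_bitrate_kbps(competing_devices=3))     # busier cabin: 384 kbps
print(pick_aac_bitrate_kbps(competing_devices=12))    # crowded room: 128 kbps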

I'm not so closed-minded as to think that a protocol is the same thing as the innovation behind a patent. In fact, the codec is the same; it's the application of that codec, or the standard, that needs to be upgraded. You don't still drive on tire formulas from the '80s, so why would you think your tires are the same? (Bias-ply notwithstanding.)

That's like saying cheeseburgers from McDonald's are the same as chicken nuggets. Sure, they are both still food, but that's the level of simplification you are using. And that's a LOT of slack. :)


The codec by itself was made to operate on devices with the least amount of complexity, so that the ICs of the day could do it. We are talking about 1988 levels of computing power from the portable DACs and SoCs available at the time.

Your argument assumes that digital technology stays the same. If that were true? DirecTV would only have a few channels, and the humble satellite phone would never have come into existence.

In fact, you seem to have a fundamental misunderstanding of what is being asked. The OP asked,
So then it got me wondering if there is some limitation of older Bluetooth standards.
And to that, the answer was YES. 2.1 is fundamentally more lossy in audio than 3.0 or 4.0, for reasons you could not know unless you were a part of the Bluetooth consortium (SALESH MISHRA, go look me up).
This has in fact nothing to do with the underlying codec. That's the kicker. But then I don't expect you to understand why. It's like saying spark plug cables are all the same. They are not; in fact, there are three kinds in mainstream use.

The Qualcomm codec has had 13 different revisions internally. However, even with 4.0, THE MANUFACTURERS CHOOSE NOT TO USE THE NEW STANDARD. It's like saying that RIAA equalization is worthless. Without it? You could not get the dynamic range that you can today out of records.

Let me say this again.

The limitation is not the CODEC. Or Qualcomm. It is the fact that they are still too lazy to rewrite the code so that it works with the newly available bandwidth and the new sideband that taps into Wi-Fi 802.11.xx (I forget which subsection), which Bluetooth is able to use, similar to how Miracast works with direct pairing to the host device (master and slave, bi-directional).

You want to know WHY it's not implemented yet? Sit back, because this is inside info.

SOMEONE GOT UPSET THAT IT WAS CALLED MASTER AND SLAVE in the HEX.
IN THE HEX, MAN. HEX. It's not even WRITTEN as that in the software, just in the documentation (a la IDE master/slave, of computer-science fame).


Yes. You don't have FULL-bandwidth HD streaming of AUDIO to REDBOOK standards (or the HDCD that was planned at 24 bits, since there's no need to worry about the error correction of CD physical media or the transport) because someone on the committee did not like the master/slave wording in the documentation of the standard. But now you know. :)


The second reason? Because nobody else has done it better; the manufacturers in China would rather copy than innovate.


So bro, do me a favor. Before you think you know what you are talking about, let me know the last time you tried to file a patent.



To show you up (might as well): even if he used an Apple phone, chances are the AAC portion would not work.
Why? Well, this deals with how 3rd-party adapters do not support it out of the box. This is A BUG that 5.0 was supposed to solve; doing it another way would be a patent infringement. And Apple has a handshake that will just default to SBC, as aptX is not supported unless you custom-mod your kernel audio stack or jailbreak the phone.
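Here is a toy sketch of that handshake in Python (A2DP-style capability matching); the preference lists are my assumptions about typical behavior, not Apple's or Qualcomm's actual code:

# Toy codec negotiation: the source walks its preference list and takes
# the first codec the sink also supports; SBC is the mandatory floor.
SOURCE_PREFS = {
    "ios": ["aac", "sbc"],               # assumed: iPhones offer AAC, never aptX
    "android": ["aptx", "aac", "sbc"],   # assumed typical Android ordering
}

def negotiate(source, sink_codecs):
    for codec in SOURCE_PREFS[source]:
        if codec in sink_codecs:
            return codec
    return "sbc"

print(negotiate("ios", {"sbc"}))              # cheap adapter: SBC only
print(negotiate("ios", {"sbc", "aptx"}))      # aptX adapter: an iPhone still lands on SBC
print(negotiate("android", {"sbc", "aptx"}))  # same adapter with Android: aptX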

In fact, aptX does not make it sound better. It's actually made for two-way, low-latency audio communication. (Read: GAMES.)

In practice, you are still using the same compressor.

It goes IN PRACTICE:

iOS: handshake > AAC data stream > DAC > DSP > feedback loop > compressor (Qualcomm codec) > AAC [SBC fallback] > transport loop > data stream > DAC
(VERY SIMPLIFIED)



In fact, the only real reason 3.0 sounds better than 2.1 has to do with the transport, not the codec, of Bluetooth 3.0.
That's the thing: AAC is not the answer why. It's the fact that AAC after the conversion is such a good compression codec that even at "high" bitrates of 384 kbps, the DAC feedback loop that handles error correction does not need to handle much more than 100-250 kbps, since the AAC data packets are, in a sense, VBR after the re-modulation and re-encode of the data stream. So in practice it will sound better with a simple sine wave, and it can handle music better, as much of it is not wide-bandwidth in practice. Think rap music with a simple beat vs. a symphony with full content from 35 Hz to 16 kHz on average, the most sensitive range of the ear psychoacoustically.

Hell, for 99% of the people in the world, Bluetooth audio is the BEST they have ever heard. Why? It's the ONLY real standard that is out there for EVERYONE to hear. In fact, the speakers are the only thing masking whatever differences might otherwise stand out to anyone.

But hey. I think I made my point.

Not to mention... the USB Audio standard should have been made standard a LONG time ago. Everyone plugs in their phone. Wonder of wonders, it's still not supported in most car head units. And the wireless version of it went away in 2005.
Why? Well, why make chips just for audio when VIDEO is what everyone wants?

Even Pioneer has an app that lets you stream REDBOOK audio to your deck. But then again, it does not work with things like Tidal (buggy) or some apps such as Spotify.

You know what's really stupid?
Thinking you give a shit.

 

·
Listener of Music
Joined
·
3,225 Posts
Lots of the world still uses old cell tech on its networks, RCA cables still use copper, and the speaker still uses a magnet. That was my point. It's great that you took the time to give that explanation, but I didn't make my point clear: just because it's old doesn't make it ineffective, and just because there is better doesn't mean it's in common use.

You got condescending with me. All I was saying is that just because it's old doesn't make it ineffective.

I do understand the science behind most of it. However, it doesn't matter if what's better isn't widely used. Like electric cars in the '50s and 35 mpg cars in the '90s: they were there, they just didn't get used.
 

·
Listener of Music
Joined
·
3,225 Posts
I definitely wasn't calling you stupid. Using "it's old" as a backing plate irks me; that's what I was calling out. Damn internet. Can't see each other's faces while we're discussing.
 