TL;DR: USB-C AirPods Pro support lossless audio with the upcoming Vision Pro headset due to the 5GHz band support in their H2 chip. The previous version only had 2.4GHz.
TL;DR: There is enough bandwidth in 2.4GHz, but fuck you, consumer, buy more AirPods.
According to Wikipedia, the theoretical max bandwidth on the 2.4GHz band is 706.25 kbit/s downstream.
I don’t have data from Apple, but Qualcomm’s lossless Bluetooth audio transmits at up to 1 Mbit/s.
So a three-minute internet search supports Apple’s story rather than yours.
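To put the two quoted figures side by side, here’s a quick sketch (the 706.25 kbit/s is the Wikipedia figure above, the 1 Mbit/s is Qualcomm’s claim; both are labeled as such in the code):

```python
bt_classic_kbps = 706.25   # Wikipedia: max basic-rate ACL downstream
aptx_lossless_kbps = 1000  # Qualcomm's stated peak for aptX Lossless

shortfall = aptx_lossless_kbps - bt_classic_kbps
print(f"Shortfall: {shortfall:.2f} kbit/s "
      f"({shortfall / aptx_lossless_kbps:.0%} of the required rate)")
# Shortfall: 293.75 kbit/s (29% of the required rate)
```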
Yeah, wiki also says:
Bluetooth 2.0 already supports 3 Mbit/s (2.1 Mbit/s real-world):
The bit rate of EDR is 3 Mbit/s, although the maximum data transfer rate (allowing for inter-packet time and acknowledgements) is 2.1 Mbit/s.
BT5 expands on the Low Energy specification to allow a 2 Mbit/s burst:
Bluetooth 5 provides, for BLE, options that can double the speed (2 Mbit/s burst) at the expense of range, or provide up to four times the range at the expense of data rate.
Also, 802.11g already runs at 54 Mbit/s in a 20 MHz-wide channel.
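Lining those quoted link rates up against the 1 Mbit/s figure from the comment above (a rough sketch; these are peak link rates, not sustained audio throughput):

```python
required_mbps = 1.0  # the aptX Lossless figure from the comment above

link_rates_mbps = {
    "Bluetooth 2.0 EDR (real-world max)": 2.1,
    "Bluetooth 5 LE burst": 2.0,
    "802.11g, 20 MHz channel": 54.0,
}

for link, rate in link_rates_mbps.items():
    print(f"{link}: {rate} Mbit/s -> {rate / required_mbps:.1f}x the needed rate")
```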
Wow. Not only incorrect, but incorrect in the worst way by fucking up maths by a factor of a thousand!
What math? There is no math in there. I typed 1 unit incorrectly. One that didn’t actually matter for the argument.
But he did his own research! Checkmate!
Should I rather trust a random naysayer on the internet? I haven’t even heard his numbers or sources yet. My argument still stands after the correction of a unit.
I think you meant 2.4GHz instead of 2.4kHz, and I think it can transmit a tad more than that, given that 2.4GHz Wi-Fi has much more bandwidth than 1 Mbit/s.
I’m not sure if you’re serious or trying to be sarcastic.
Bluetooth and WiFi are two different things.
For starters, standard Bluetooth operates on 1 MHz-wide channels and BLE on 2 MHz-wide channels, whereas WiFi (nowadays) operates on 20 or 40 MHz-wide channels.
Modern Bluetooth (on 2.4GHz) can theoretically do bursts of 2 Mbit/s, but in practice even 1 Mbit/s is hard to hit in a sustained fashion.
2.4GHz is just a frequency band and is not the same as bandwidth.
You might as well argue that a pickup truck and a Formula 1 race car should be able to reach the same top speed in the same time because their wheelbase is the same.
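The distinction falls straight out of the Shannon capacity formula, C = B·log2(1 + SNR): capacity depends on the channel width B, and the carrier frequency doesn’t appear in it at all. A rough sketch with an assumed, purely illustrative SNR:

```python
import math

def capacity_mbps(width_hz: float, snr_linear: float) -> float:
    """Shannon capacity C = B * log2(1 + SNR), returned in Mbit/s."""
    return width_hz * math.log2(1 + snr_linear) / 1e6

snr = 10 ** (20 / 10)  # assume 20 dB SNR on both links (purely illustrative)

print(f"1 MHz Bluetooth channel:  {capacity_mbps(1e6, snr):.1f} Mbit/s ceiling")
print(f"2 MHz BLE channel:        {capacity_mbps(2e6, snr):.1f} Mbit/s ceiling")
print(f"20 MHz Wi-Fi channel:     {capacity_mbps(20e6, snr):.1f} Mbit/s ceiling")
# Same 2.4 GHz band in every case; only the channel width changes.
```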
Think again
Honestly the most frustrating part is that there is plenty to criticize Apple on, so there’s no reason to get caught up in fabricated clickbaity nonsense.
But instead of focusing on genuine concerns, people would rather hop on some misinformation train.
All the while, if you espouse opinions that are a bit more nuanced than “Apple bad”, then you must be a bootlicker, like you said.
It’s as if people are more concerned about missing out on joining the hype and showing off their armchair skills, rather than exercising a modicum of critical thinking.
For real, lemmy is bad regarding this topic.
I started as a Unix sysadmin over ten years ago and still love using it, but I switched to macOS client administration. So I know both sides (and also Windows on my gaming rig), but I hate seeing this “xxx is shit” style of content. It just kills the vibe in the tech community.
Are you so sure about this? Perhaps I’m paranoid, but I think the “pro” designation on the new A17 just means that it has a memory controller capable of USB 3 bandwidth. The iPhone 16 will have the A17 (non-pro) and will therefore not have USB 3 speeds. It’s just forward thinking. Apple SoCs have had monikers for years now, like ‘Bionic’ and ‘Fusion’, and even the different core clusters have names like Hurricane, Zephyr, etc.
NONE of them have been featured as prominently in the marketing/press material as the “pro” moniker of the A17. I don’t think that’s a coincidence; I think the term is more than just a name.
Ultimately, my point is that while you’re right that the use of the A16 in the base iPhone 15 is sufficient reason for it not to have USB 3 speeds implemented this year, I wouldn’t hold my breath for next year.
True. Corrected.
About the bandwidth: that’s directly from Wikipedia.
You can have audio of arbitrary bitrate. Lossless just means it isn’t being resampled or transcoded in a way that prevents exactly reconstructing the original signal. There’s no reason why you couldn’t support lossless audio up to 700 kbit/s, and the difference between 700 kbit/s and 1 Mbit/s is well outside the range of perceptibility. You can also losslessly compress most audio that humans listen to by a significant degree, which is a completely transparent way to support higher bitrates if you can spare the processing time.
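For example, here’s a minimal sketch of that transparency using the third-party soundfile library (the filenames are placeholders; assumes a 16-bit PCM input.wav exists):

```python
import soundfile as sf  # third-party: pip install soundfile

pcm, rate = sf.read("input.wav", dtype="int16")  # original 16-bit samples
sf.write("compressed.flac", pcm, rate)           # lossless FLAC encode
decoded, _ = sf.read("compressed.flac", dtype="int16")

assert (pcm == decoded).all(), "lossless means bit-exact reconstruction"
print("Same samples, fewer bits on the wire.")
```

FLAC typically shaves roughly 30–50% off CD-quality PCM while the decoder still reconstructs every sample exactly.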
Lossless is understood to have a bitrate of at least 1411 kbit/s, or about 1.4 Mbit/s.
The theoretical sustained bandwidth of Bluetooth on the 2.4GHz spectrum is 1 Mbit/s, but in practice it’s a chunk lower, in part due to overhead.
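Spelling out where both numbers sit (a quick sketch; the 1 Mbit/s sustained figure is the optimistic one from above):

```python
# Where the 1411 kbit/s figure comes from: uncompressed CD-quality PCM.
sample_rate_hz = 44_100
bit_depth = 16
channels = 2

cd_kbps = sample_rate_hz * bit_depth * channels / 1000  # 1411.2 kbit/s
bt_kbps = 1000  # optimistic sustained Bluetooth throughput, from above

print(f"CD-quality PCM needs {cd_kbps:.1f} kbit/s; "
      f"Bluetooth sustains ~{bt_kbps} kbit/s; "
      f"deficit: {cd_kbps - bt_kbps:.1f} kbit/s")
```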
Even if we assume you could just cram a higher bitrate through a smaller bandwidth (spoiler: you can’t), everyone would be up in arms about Apple lying about lossless, and class action suits would ensue.
That said, you can’t. This is not like your internet connection where you’ll just be buffering for a minute.
As for what is and isn’t perceptible, I think you’re mixing up your tonal frequencies with your bitrates here.
No, lossless isn’t assumed to have a bitrate of at least 1.4Mbps.
Yes, lossless compression exists.
No, I am not mixing up bitrate and frequency. Yes, with a typical codec the difference between 700 kbit/s and 1 Mbit/s is almost certainly imperceptible in almost all conditions.