Xcode 14 deprecates bitcode - but why?

Xcode 14 Beta release notes are out, all thanks to the annual WWDC.

And alas, Bitcode is now deprecated, and you'll get a warning message if you attempt to enable it.

And I was wondering, why has this happened? Was there any downside to using Bitcode? Was it somehow painful for Apple to maintain it? And how will per-iPhone-model compilation operate now?

Holliman answered 8/6, 2022 at 10:0 Comment(2)
The release notes say that Bitcode is not required for watchOS and tvOS and you will get a deprecation warning. I guess those platforms don't have sufficient variation to warrant bitcodeWhodunit
@Whodunit the release notes also say it's deprecated for other platforms too (further down the notes) "Because bitcode is now deprecated, builds for iOS, tvOS, and watchOS no longer include bitcode by default. (87590506)"Fineness

Bitcode is actually just the LLVM intermediate language. When you compile source code using the LLVM toolchain, the source is translated into an intermediate representation named Bitcode. This Bitcode is then analyzed, optimized, and finally translated into CPU instructions for the desired target CPU.

The advantage of doing it that way is that all LLVM-based frontends (like clang) only need to translate source code to Bitcode; from there on, the LLVM toolchain doesn't care whether the Bitcode was generated from C, C++, Obj-C, Rust, Swift, or any other source language. Once there is Bitcode, the rest of the workflow is always the same.
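To make that concrete, here is a minimal sketch of the pipeline using the stock LLVM tools; hello.c is just an illustrative example, not anything Apple-specific:

    /* hello.c - any LLVM frontend input will do */
    #include <stdio.h>

    int main(void) {
        printf("hello, bitcode\n");
        return 0;
    }

and then:

    clang -c -emit-llvm hello.c -o hello.bc   # frontend: C -> Bitcode
    llvm-dis hello.bc -o hello.ll             # optional: view the IR as text
    clang hello.bc -o hello                   # backend: Bitcode -> native binary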

One benefit of Bitcode is that you can later generate instructions for another CPU without having to re-compile the original source code. E.g. I may compile C code to Bitcode and have LLVM generate a running binary for x86 CPUs at the end. If I save the Bitcode, however, I can later tell LLVM to also create a running binary for an ARM CPU from that Bitcode, without compiling anything and without access to the original C code. And the generated ARM code will be as good as if I had compiled for ARM from the very start.
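As a sketch of that retargeting idea, using LLVM's standalone code generator llc (the target triples and file names are only examples; in practice this only works cleanly when the ABIs of the two targets line up, a caveat the last answer on this page raises):

    # compile once to Bitcode and keep the .bc file
    clang -c -emit-llvm hello.c -o hello.bc

    # generate x86-64 assembly from the Bitcode ...
    llc -mtriple=x86_64-apple-darwin hello.bc -o hello_x86_64.s

    # ... and later ARM64 assembly from the very same Bitcode,
    # without recompiling hello.c
    llc -mtriple=arm64-apple-darwin hello.bc -o hello_arm64.s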

Without the Bitcode, I would have to convert x86 code to ARM code, and such a translation produces far worse code: the original intent of the code is often lost in the final compilation step to CPU instructions, a step that also involves CPU-specific optimizations that make no sense for other CPUs. Bitcode, on the other hand, retains the original intent pretty well and only performs optimizations that all CPUs benefit from.

Having the Bitcode of all apps allowed Apple to re-compile that Bitcode for a specific CPU, either to make an app compatible with a different kind of CPU or an entirely different architecture, or just to benefit from the better optimizations of newer compiler versions. E.g. if Apple had shipped an iPhone tomorrow that used a RISC-V instead of an ARM CPU, all apps with Bitcode could have been re-compiled to RISC-V and would natively support the new CPU architecture, despite the author of the app never having even heard of RISC-V.

I think that was the idea behind Apple requiring all apps in Bitcode format. But that approach had issues from the beginning. One issue is that Bitcode is not a frozen format; LLVM updates it with every release and does not guarantee full backward compatibility. Bitcode has never been intended as a stable representation for permanent storage or archival. Another problem is that you cannot use assembly code, as no Bitcode is emitted for assembly. And you cannot use pre-built third-party libraries that come without Bitcode.

And last but not least: AFAIK Apple has never used any of the Bitcode advantages so far. Despite requiring all apps to contain Bitcode in the past, the apps also had to contain pre-built fat binaries for all supported CPUs, and Apple would always just ship that pre-built code. E.g. for iPhones you once had a 32-bit ARMv7 and a 64-bit ARM64 version, as well as the Bitcode, and during app thinning Apple would remove either the 32-bit or the 64-bit version, as well as the Bitcode, and then ship what's left over. Fine, but they could have done that even with no Bitcode present. Bitcode is not required to thin out architectures of a fat binary!
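For illustration, thinning a fat binary is a plain lipo operation on the Mach-O slices; MyApp is a placeholder here, and no Bitcode is involved at any point:

    # which architectures does the fat binary contain?
    lipo -info MyApp
    # Architectures in the fat file: MyApp are: armv7 arm64

    # keep only the 64-bit slice
    lipo -thin arm64 MyApp -output MyApp_arm64

    # the reverse also works: glue per-CPU binaries into one fat binary
    lipo -create MyApp_armv7 MyApp_arm64 -output MyApp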

Bitcode would be required to re-build for a different architecture, but Apple has never done that. No 32-bit app magically became 64-bit through Apple re-compiling the Bitcode. And no 64-bit-only app was magically made available to 32-bit systems by Apple re-compiling the Bitcode on demand. As a developer, I can assure you: the iOS App Store always delivered exactly the binary code that you had built and signed yourself, and never any code that Apple created from the Bitcode, so nothing was optimized server-side. Even when Apple switched from Intel to M1, no macOS app magically got converted to native ARM, even though that would have been possible for every x86 app in the App Store, since Apple had the Bitcode. Instead Apple still shipped the x86 version and let it run in Rosetta 2.

So imposing various disadvantages onto developers by forcing all code to be available as Bitcode, and then not using any of the advantages Bitcode would give you, kind of makes the whole thing pointless. And now that all platforms have migrated to ARM64, and in a couple of years there won't even be fat binaries anymore (once x86 support for Mac has been dropped), what's the point of continuing with that stuff? I guess Apple took the chance to bury that idea once and for all. Even if they one day add RISC-V to their platforms, developers can still ship fat binaries containing ARM64 and RISC-V code at the same time. This concept works well enough, is way simpler, and has no downside other than bigger binaries, and that is something server-side app thinning can fix, as only the code for the current platform needs to be included in the download.

Arenicolous answered 3/8, 2022 at 10:6 Comment(9)
Interesting. So they annoyed us for years with all this Bitcode hassle - for absolutely nothing? Funny. I just came across this post because Xcode 14 wants to force me to enable Bitcode for the Pods of an existing Unity project; otherwise the build fails. I have no clue why this is the case if they dropped Bitcode support. It does not make any sense to me. In Xcode 13, the project was building just fine.Descendant
@Descendant The idea of having apps on the App Store in a CPU-neutral form is not a bad one; that's why Android chose Java Byte Code (JBC). Yet JBC is a pretty stable representation that is well documented and understood; Bitcode isn't. Also, on Android the device itself transforms JBC to CPU code (AOT nowadays), which Apple didn't want to happen, so their servers would have had to perform that task, and I can see all kinds of pitfalls with that concept. As for your concrete problem, create a new question and provide some log output there; someone might know the answer to your issue.Arenicolous
"Apple never used bitcode" isn't true. The transition to 64-bit watchOS involved recompiling all existing armv7 apps to a wacky transitional arm64_32 architecture (which was arm64 but with 32-bit pointers) using bitcode. They also attempted to use bitcode to enable Swift Concurrency backdeployment for apps built with Xcode 13.1, but that was only ever enabled for TestFlight as it mostly just caused problems. It's also worth noting that Apple's bitcode format is not the same thing as LLVM bitcode, and actually was frozen.Imperative
@ThomasGoyne Apple's bitcode format cannot be frozen: if LLVM introduces a new feature that requires changes to their bitcode format, they just change it. How would that feature then translate to Apple's bitcode format if it were frozen? Apple could not offer that feature at all, as they could not express it, but AFAIK that has never happened; and such changes to the LLVM bitcode format did happen a couple of times in the last few years.Arenicolous
Apple's bitcode format only made additive changes. Bitcode generated by older compilers could be consumed by newer compilers, but not vice-versa. LLVM's format by comparison can only be consumed by the same version as it was created by, as they change the meaning of existing things. I suspect the difficulty in expressing everything as purely additive changes may have been a factor in them abandoning bitcode.Imperative
Thank you for the insights. In one project we just analyzed the app size: when exporting it manually with Xcode and selecting "optimize for all platforms", we get app bundles of 30-32 MB. When uploading to the App Store, however, the fat bundle is >90 MB, resulting in all users downloading more than 3x the app size they seem to need. Is there any way now to get device-specific bundles from the App Store?Hugmetight
@Hugmetight Just make that its own question; that's what this platform exists for.Arenicolous
I have seen an app resigned/edited by Apple before; does this count? The app update changelog said something like "This app has been updated by Apple to display the Apple Watch app icon."Valois
With bitcode disabled, I don't see .bcsymbolmap files any more, so how will symbolication work, assuming I am shipping an SDK?Langsdon

The Apple Watch Series 3 was the last device that did not support 64-bit (i.e. i386 or armv7).

Apple has now stopped supporting the Apple Watch Series 3. [1] They would have been happy to drop support for bitcode.

[1] https://www.xda-developers.com/watchos-9-not-coming-apple-watch-series-3

Carmeliacarmelina answered 9/6, 2022 at 1:48 Comment(4)
Was bitcode useful for bundling 32-bit and 64-bit simultaneously?Abbreviated
I think you are probably right, but I am curious whether that is definitely the reason - was bitcode only for thinning builds by CPU architecture? Did it have nothing to do with serving the correct image assets (different resolutions, for example) to the right devices, like the Google Play Store's split APKs from their App Bundle format? developer.android.com/guide/app-bundleFrangible
App Thinning has nothing to do with Bitcode.Renaldo
This doesn't answer the question at all. The question above is: why is Apple deprecating bitcode?Agma

Xcode removed support for the armv7/armv7s/i386 targets. Bitcode was used to build for different CPU targets, but now all devices are arm64 and hardly any developers still use this technology, so deprecating it is probably a wise choice.

Amathiste answered 10/6, 2022 at 10:9 Comment(0)

Bitcode was always pointless: even if you compiled the bitcode for another architecture, there's a high chance it wouldn't actually work, because the ABI is different. For example, when you compile a C program, the libc headers are actually different for every architecture. I'm glad they are finally getting rid of it, as it caused more problems than it solved. At most, they could have re-optimized the binary for the same architecture, or a similar enough one. There is also the problem of unwanted symbols leaking into bitcode builds, so you either have to rename/obfuscate them or get hit by collisions (a big problem if you are a library/framework vendor).
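To illustrate the ABI point with a toy example of my own (not from the original answer): the same C source bakes different type sizes into the generated Bitcode depending on the target, so a .bc file produced for one architecture can silently mismatch another:

    #include <stdio.h>

    int main(void) {
        /* LP64 targets (arm64, x86_64): both print 8.
           ILP32 targets (armv7, i386): both print 4.
           clang resolves these sizes while generating the Bitcode,
           so the IR is already architecture-specific. */
        printf("sizeof(long)   = %zu\n", sizeof(long));
        printf("sizeof(void *) = %zu\n", sizeof(void *));
        return 0;
    }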

Resolve answered 11/11, 2022 at 6:54 Comment(0)
