notes.org


Description of the llvmng and bitcode backend for Haskell: https://mail.haskell.org/pipermail/ghc-devs/2017-September/014689.html

macOS uses the Mach-O format, and its linker relies on a feature called 'subsections_via_symbols' to strip dead code

  • it works by assuming all code between two symbols belongs to the first symbol

the stripper deletes a symbol's code if it determines the symbol is unused, i.e. it deletes everything below that symbol up to the next symbol

ghc uses tables-next-to-code, via LLVM's 'prefix data' feature, which places data about a symbol immediately before the symbol

so the linker can't dead-strip any code produced by ghc on macOS

looks like the motivation for the llvmng and bitcode backend was to make the binaries smaller.
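
A quick way to check whether an object file was emitted with this feature is to look at its Mach-O header flags. A rough sketch (Main.o is just a placeholder object file name):

# objects assembled for dead-code stripping carry the SUBSECTIONS_VIA_SYMBOLS
# flag in their Mach-O header
otool -h -v Main.o
# at link time, dead stripping is requested via the linker's -dead_strip flag
clang -Wl,-dead_strip Main.o -o main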

Mach-O

https://en.wikipedia.org/wiki/Mach-O

Talk about cross compilation: https://www.youtube.com/watch?v=46A02obKt8g&list=PLbjcAmsCYuS4xspQb5BHnIHhi4BrWteGz&index=7

Template Haskell

iserv-proxy and GHCSlave

  • Need to acquire the iserv-proxy program that runs on the build machine.
    • the iserv-proxy can be obtained from the ghc source tree.
    • within ghc/utils/remote-proxy
      • to build, need to run ./boot && ./configure <options> in ghc/
      • then go to ghc/utils/remote-proxy and run make (see the sketch after this list)
      • this takes a long time
  • Are there any caches available?
    • The dependency of remote-iserv on the main branch 9.3 is just libiserv
    • libiserv might be on hackage and therefore available in the nixpkgs cache
    • 9.0.1 is available https://hackage.haskell.org/package/libiserv
    • looks like libiserv comes with ghc, so maybe just need to get the 8.10.7 branch up
      • Correct that we don't need to compile ghc, but it looks like the remote-proxy library depends on a module that doesn't exist.
        • i.e. Remote.Message
    • So probably will need to look for a version of haskell that ships a Remote.Message module.
      • Although libiserv does include the Remote.Message module, maybe it's because I'm using nix?
        • Wasn't exactly because of nix, the
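
Rough sketch of the build steps described above (directory name taken from the notes; it differs between GHC versions, e.g. remote-iserv / iserv-proxy in newer trees):

# in the ghc/ checkout
./boot && ./configure <options>
# then build just the proxy; this pulls in a large part of the tree
cd utils/remote-proxy
make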

iserv

  • need to cross compile iserv so that i have iserv-exe
    • the cross compiler version is 8.4.0 but only the 8.4.1 source distribution is available
    • ran into an error about a missing header file, maybe the ghc 8.4.0 version doesn't ship with the header
      • had to change the path from a relative one to an absolute one in /iserv/cbits/iservmain.c
  • Need to acquire the GHCSlave program that runs on the target.

Haskell Build System

https://gitlab.haskell.org/ghc/ghc/-/wikis/building/architecture https://gitlab.haskell.org/ghc/ghc/-/wikis/building/hadrian https://gitlab.haskell.org/ghc/ghc/-/wikis/building/preparation https://gitlab.haskell.org/ghc/ghc/blob/master/hadrian/doc/make.md https://gitlab.haskell.org/ghc/ghc/-/blob/master/hadrian/README.md

https://gitlab.haskell.org/ghc/ghc/-/wikis/building/using https://gitlab.haskell.org/ghc/ghc/-/wikis/contributing https://gitlab.haskell.org/ghc/ghc/-/wikis/building#building-and-porting-ghc

Haskell Cabal and Cross Compilation of Libraries

https://log.zw3rk.com/posts/2017-05-19-ghc-s-cross-compilation-pipeline/

Cabal Cross Compilation

https://log.zw3rk.com/posts/2017-05-17-the-haskell-cabal-and-cross-compilation/ by default cabal will use a non-prefixed toolchain, which results in the library being compiled for the build machine.

cabal provides the necessary arguments to pass in the toolchain:

--builddir=dist/arm-linux-gnueabihf

--with-ghc=arm-linux-gnueabihf-ghc --with-ghc-pkg=arm-linux-gnueabihf-ghc-pkg --with-gcc=arm-linux-gnueabihf-clang --with-ld=arm-linux-gnueabihf-ld --hsc2hs-options=--cross-compile

--configure-option=--host=arm-linux-gnueabihf
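
Putting the flags above together, a configure call for the Raspberry Pi target might look roughly like this (a sketch; it assumes the arm-linux-gnueabihf-prefixed tools are already on PATH):

cabal configure \
  --builddir=dist/arm-linux-gnueabihf \
  --with-ghc=arm-linux-gnueabihf-ghc \
  --with-ghc-pkg=arm-linux-gnueabihf-ghc-pkg \
  --with-gcc=arm-linux-gnueabihf-clang \
  --with-ld=arm-linux-gnueabihf-ld \
  --hsc2hs-options=--cross-compile \
  --configure-option=--host=arm-linux-gnueabihf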

https://github.com/ghc-ios/ghc-ios-scripts

GHC and Libraries

https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/packages.html https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/separate_compilation.html https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/shared_libs.html https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/phases.html https://downloads.haskell.org/~ghc/latest/docs/html/users_guide/utils.html hsc2hs

Nix Flakes

https://www.tweag.io/blog/2021-12-20-nix-2.4/

Part 1

https://www.tweag.io/blog/2020-05-25-flakes/

Original Nix expressions can access

  • arbitrary files, e.g. ~/.config/nixpkgs/config.nix
  • environment variables
  • git repositories
  • files in the Nix search path $NIX_PATH
  • command-line arguments --arg
  • and the system type builtins.currentSystem

No standard way to compose Nix-based projects. Typical ways to compose Nix files rely on the Nix search path

  • e.g. import <nixpkgs>, or use fetchGit or fetchTarball
    • poor reproducibility and bad user experience due to the use of git hashes

No easy way to deliver Nix-based projects to users. Nix has a ‘channel’ mechanism, but it’s not easy to create channels and they are not composable.

Nix projects lack standardised structure.

  • there are conventions e.g. shell.nix or release.nix
    • but they don't cover many common use cases
      • no way to discover NixOS modules provided by a repository

Flake is the solution

A flake is just a source tree, e.g. a git repository

  • containing a file named flake.nix
    • provides standardized interface to Nix artifacts
      • such as
        • packages
        • NixOS modules

Flakes can have deps on other flakes

  • With a lock file pinning those deps to exact revs

Using Flakes

https://github.com/edolstra/dwarffs is a flake because it contains a file named flake.nix

  • it tells nix what the flake provides
    • Such as
      • Nix packages
      • NixOS modules
      • CI tests
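
A couple of commands for poking at such a flake (a sketch; needs a flakes-enabled Nix, i.e. 2.4 or later):

# list what the flake provides (packages, NixOS modules, checks, ...)
nix flake show github:edolstra/dwarffs
# build its default package straight from the flake reference
nix build github:edolstra/dwarffs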

Building a Cross Compiler

  • Scan which resources and blog posts would be useful.

zw3rk Blog

2021

https://log.zw3rk.com/posts/2021-06-28-off-by-one/

  • Story about static linker for Mach-O file format on AArch64.

2018

Q1 Jan - March

https://log.zw3rk.com/posts/2018-01-09-what-is-new-in-cross-compiling-haskell/

  • Talks about providing cross compiler binaries on hackage.mobilehaskell.org

https://log.zw3rk.com/posts/2018-01-12-talk-building-android-apps-with-haskell/

  • Talk about building Android apps with Haskell.

https://log.zw3rk.com/posts/2018-01-17-provisioning-a-nixos-server-from-macos/

https://log.zw3rk.com/posts/2018-02-05-what-is-new-in-cross-compiling-haskell/

  • Talks about overlays in the repo being reflected in the server.
  • Talks about writing a patch/fix for cabal so that it works better for cross-compilation
    • So that it propagates the --with-PROG flags into dependencies.
  • This must be things like: --with-ghc=arm-linux-gnueabihf-ghc --with-ghc-pkg=arm-linux-gnueabihf-ghc-pkg --with-gcc=arm-linux-gnueabihf-clang --with-ld=arm-linux-gnueabihf-ld
    • Talks about SLURP and the Uncurated Hackage Layer

    https://log.zw3rk.com/posts/2018-03-02-what-is-new-in-cross-compiling-haskell/

    • Talks about cabal --with-PROG flags being properly respected when using new-build
    • Talks about working for IOHK
      • Assisting the DevOps team to cross compile Haskell with GHC
  • From Windows to Linux
    • Says he will broaden and improve GHC's cross compilation capabilities.
    • Talks about being on a good road to getting it all sorted in GHC 8.6
    • Hopes that GHC 8.6 can be built by default with the shake based build system
      • hadrian will have extensive cross compilation capabilities for various platforms.
    • Talks about work done making Haskell cross compile from Linux to Windows:

    https://log.zw3rk.com/posts/2018-03-14-talk-case-study-cross-compiling-dhall-json/

    • Gave a talk about how to cross compile dhall-json to the Raspberry Pi.
      • Cross compiling non-trivial haskell packages, and that issues such as
  • ghc-head
  • build-type
  • Template Haskell
    • have mostly trivial fixes, most of which can be upstreamed.
    • Talk coincided with the release of GHC 8.4 and the release of a new zlib package to hackage.
      • renders the fix to zlib unnecessary
    • Fix for contravariant is crude.
Q2 April - June

https://log.zw3rk.com/posts/2018-05-03-what-is-new-in-cross-compiling-haskell/

  • Talks about finished hadrian PRs
    • Now can build relocatable GHCs with hadrian by default.
  • Windows compilation can now be done via WINE
    • Don't need a Windows installation for iserv
    • Just run iserv via WINE
  • Talks about working with nix and limitations with respect to cross compilation.
    • Specifically flattening of conditionals (os/arch/flags) that cabal2nix does.
Q3 July - September

https://log.zw3rk.com/posts/2018-07-04-what-is-new-in-cross-compiling-haskell/

  • Adapting the llvm-ng backend to build a fresh set of pre-built 8.6.1 cross compilers.

https://log.zw3rk.com/posts/2018-08-14-what-is-new-in-cross-compiling-haskell/

  • Playing with -target
    • Believes that the best solution is to have a minimal ghc that doesn't ship with any libs.
      • All libs should be built on demand per target.
  • Likely want to pre-build and ship the Runtime System library rts, as we do not have a cabal package that would build the rts.
    • Would need a partial target toolchain to build the rts for all the bundled rts flavours to be shipped.
      • On the other side, likely want to use iserv, e.g. the -fexternal-interpreter
        • Ran into some strange behaviour while compiling test-suite packages
  • iserv complains about code being loaded multiple times.
  • exploring how to get proper test-coverage for libraries
    • even ghc in a cross compiled setting
      • Fixed: the -staticlib argument doesn't fail anymore if object files in the archives it's trying to concat are odd-length
        • GHC doesn't panic anymore when -jN, N>1 is used and it fails to find/load a library.
      • got the llvmng code to work with ghc 8.6, retraced the performance improvement
      • Use AWS compute time to build cross compilers once the final 8.6.1 hits. (must be talking about Windows here)
Q4 Oct - Dec

https://log.zw3rk.com/posts/2018-10-09-what-is-new-in-cross-compiling-haskell/

  • llvm-ng, cmm, custom ghc, dump cmm
    • decoupling the code generator and the ghc front end
  • Usually
    • GHC reads file
    • Turns into AST
    • Desugars
    • Runs Optimizations
    • Turns into STG
    • Turns into cmm
  • so far cmm wasn't binary serializable
    • had to plug the code generator into ghc
  • and have the frontend run
    • then call the code gen
      • talks about a minimalist ghc; ghc should be packaged with:
        • rts
        • ghc
        • ghc-prim
        • integer-gmp
        • integer-simple
        • base
        • array
        • deepseq
      • which means Cabal needs to be bootstrapped
      • prefer to get rid of template-haskell
        • however ghc is linked against it
        • not shipping it and reinstalling a different one would potentially break things depending on TH
      • a way around is to use the external interpreter only
        • could recompile external interpreter against your changed TH library.
      • ghc will likely ignore {-# ANN … “HLint: … #-} and provide {-# HLINT … #-} pragma.
      • would like to get minimal ghc dist working for -target first

2017

Q2 April - June

https://log.zw3rk.com/posts/2017-04-20-hello-world-and-a-cross-compilation-survey/ Survey

https://log.zw3rk.com/posts/2017-05-03-building-iconv-for-android/ check <-

https://log.zw3rk.com/posts/2017-05-09-cross-compilation-survey-results/ Survey

https://log.zw3rk.com/posts/2017-05-10-quick-headless-raspberry-pi-setup/

https://log.zw3rk.com/posts/2017-05-11-making-a-raspbian-cross-compilation-sdk/

https://log.zw3rk.com/posts/2017-05-16-a-haskell-cross-compiler-for-raspberry-pi/

https://log.zw3rk.com/posts/2017-05-17-the-haskell-cabal-and-cross-compilation/

https://log.zw3rk.com/posts/2017-05-18-why-use-a-cross-compiler/ Why use cross compiler

https://log.zw3rk.com/posts/2017-05-19-ghc-s-cross-compilation-pipeline/ check <-

https://log.zw3rk.com/posts/2017-05-23-template-haskell/ th <-

https://log.zw3rk.com/posts/2017-05-24-template-haskell-and-cross-compilation/ th <-

https://log.zw3rk.com/posts/2017-05-25-cross-compiling-template-haskell/ th <-

  • GHC's external interpreter splits ghc into 2 components
  • ghc
  • the interpreter server iserv
    • Passing -fexternal-interpreter to ghc will spawn an iserv instance and run interpreted code through it (rough invocation sketched below).
    • ghc instructs iserv to load and link libraries as needed, and to evaluate bytecode objects.
    • iserv can query ghc for the current compilation env during evaluation.
    • for cross compilation, iserv is split into two parts
  • iserv-proxy, which serves as the iserv interface to GHC on the build machine.
  • GHCSlave, on the target machine
    • first need a cross compiler
    • Building iserv
  • iserv-proxy is built with the build machine's ghc as it runs on the build machine
  • iserv-bin, which contains iserv-proxy, can be found in the iserv subfolder of the ghc tree
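
Roughly how the pieces fit together on the build machine (a sketch; the target prefix, host address and port are placeholders for wherever the remote iserv/GHCSlave is listening):

# -fexternal-interpreter runs interpreted code out of process,
# -pgmi selects the interpreter program (here the proxy),
# -opti passes arguments through to it (placeholder host and port)
arm-linux-gnueabihf-ghc -fexternal-interpreter \
  -pgmi $(which iserv-proxy) \
  -opti10.0.0.1 -opti5000 \
  Main.hs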

https://log.zw3rk.com/posts/2017-05-27-cross-compiling-yesod-to-raspberry-pi/

https://log.zw3rk.com/posts/2017-05-30-a-haskell-cross-compiler-for-android/ check <-

https://log.zw3rk.com/posts/2017-05-31-android-and-template-haskell/ th <-

https://log.zw3rk.com/posts/2017-06-02-what-is-new-in-cross-compiling-haskell/ This contains some highlights and summaries.

https://log.zw3rk.com/posts/2017-06-06-a-haskell-cross-compiler-for-ios/ check <-

https://log.zw3rk.com/posts/2017-06-07-ios-and-template-haskell/ th <-

  • Wrap GHCSlave (remote iserv) instance into an application for iOS.
  • Build the GHCSlave iOS application with the iOS cross compiler.
  • Build iserv-proxy. -> this is to run on the build machine
  • Build iserv library. -> this is to run on the target machine
  • ghc/iserv $ aarch64-apple-ios-cabal install -flibrary
    • ghc/iserv $ x86_64-apple-ios-cabal install -flibrary
    • clone the cross compiler's ghc version, need to download that source distribution
    • Need to build a static library and wrap it into a native iOS app.
  • Code for the slave app can be found in the iOS folder of ghc-slave
aarch64-apple-ios-ghc -odir arm64 -hidir arm64 -staticlib -threaded -lffi -L/path/to/libffi/aarch64-apple-ios/lib -o hs-libs/arm64/libhs.a -package iserv-bin hs/LineBuff.hs
x86_64-apple-ios-ghc -odir x86_64 -hidir x86_64 -staticlib -threaded -lffi -L/path/to/libffi/x86_64-apple-ios/lib -o hs-libs/x86_64/libhs.a -package iserv-bin hs/LineBuff.hs
lipo -create -output hs-libs/libhs.a hs-libs/arm64/libhs.a hs-libs/x86_64/libhs.a
# we need -threaded as the startSlave function calls forkIO to start the slave in a separate thread
Q3 July - September

https://log.zw3rk.com/posts/2017-07-06-what-is-new-in-cross-compiling-haskell/ Some summary

https://log.zw3rk.com/posts/2017-08-03-what-is-new-in-cross-compiling-haskell/ Figuring out how to build distributable binaries of GHC cross compilers

https://log.zw3rk.com/posts/2017-09-05-what-is-new-in-cross-compiling-haskell/

  • Building the llvm backend, integrating the llvm bitcode backend into ghc.
  • -fllvmng backend can compile GHC, but fails to validate
Q4 Oct - Dec

https://log.zw3rk.com/posts/2017-10-08-what-is-new-in-cross-compiling-haskell/

  • ICFP, cross compilation diffs merged
  • Q monad extension for TH adjustments
  • announcement of the head.hackage overlay
  • allows having a set of patches which are turned into a separate hackage repo
  • patches are picked from this repo rather than from the upstream hackage repo if a patched package exists
  • a hackage overlay patches the packages on the server side and provides a separate hackage repo
    • which takes precedence over the upstream hackage repo
      • built hackage.mobilehaskell.org
  • so far a single patched package, zlib
  • https://github.com/mobilehaskell/hackage-overlay
    • also contains experimental ghc binary dists, built with the llvmng llvm backend

https://log.zw3rk.com/posts/2017-10-20-ghc-cross-compiler-binary-distributions/ <- check (outdated using make?)

https://log.zw3rk.com/posts/2017-10-30-building-ghc-the-package-database/ ghc’s build system and ghc-pkg

https://log.zw3rk.com/posts/2017-11-11-what-is-new-in-cross-compiling-haskell/

  • ghc 8.4.1 is coming
    • comes with a shake based build system called hadrian
  • the make based build system will eventually be dropped.
    • after investigating make, bit the bullet and went with hadrian
    • gonna write about GHC's build system
    • an ideal cross compiler dist contains a bin and lib folder; simply unpack and run bin/ghc.
  • this requires that the distribution is relocatable.
    • to achieve this, it would be great if the build system placed the package database, relocatable, into lib and the binaries into bin
    • created a fat PR into hadrian to make it possible

https://log.zw3rk.com/posts/2017-11-11-building-ghc-the-tools/ <- check – tools to build GHC

https://log.zw3rk.com/posts/2017-11-22-building-ghc-the-stages/ <- check – GHC build stages

https://log.zw3rk.com/posts/2017-12-08-what-is-new-in-cross-compiling-haskell/

  • was working on a hadrian fork to allow building relocatable binary distributions for GHC, specifically for cross compilers
  • will update hackage.mobilehaskell.org with these builds
  • Alp Mestanogullari is taking over the branch, hoping to merge chunks of it into upstream hadrian

https://log.zw3rk.com/posts/2017-12-20-relocatable-ghc-cross-compiler-binary-distributions/ <- check – talk about relocatable binary dist

https://log.zw3rk.com/posts/2017-12-21-contributing-to-ghc/ contributing to ghc

The toolchain wrappers

https://github.com/zw3rk/toolchain-wrapper <- more recently committed to than https://github.com/ghc-ios/ghc-ios-scripts

toolchain-wrapper / ghc iOS scripts

  • ghc iOS scripts talks about the stubs.
  • So these are mainly intended for iOS, but toolchain-wrapper has stuff for
    • RaspberryPi (arm-linux-gnueabihf)
    • Android (armv7-linux-androideabi, aarch64-linux-android)
    • iOS (x86_64-apple-ios, aarch64-apple-ios)
  • Wrapped commands are:
    • gcc
    • clang
    • ld
    • ld.gold
    • nm
    • ar
    • ranlib
    • cabal
  • xcode is a dependency
  • wrapper is the core script
    • it reads its own program name (target and command) and maps it to the actual (command and arguments)
targets="arm-linux-gnueabihf x86_64-linux-android armv7-linux-androideabi aarch64-linux-android x86_64-apple-ios aarch64-apple-ios wasm32-unknown-unknown-wasm aarch64-apple-darwin arm64-apple-darwin"
commands="clang ld ld.gold nm ar ranlib cabal llvm-dis llvm-nm llvm-ar"
  • bootstrap creates the program names and symlinks them to the wrapper
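
Conceptually, bootstrap does something like the following (a simplified sketch, not the actual script; targets and commands are the lists shown above):

for target in $targets; do
  for cmd in $commands; do
    # every <target>-<cmd> program name becomes a symlink to the single wrapper
    ln -sf wrapper "$target-$cmd"
  done
done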

When the command name matches cabal:

*-cabal)
	  fcommon="--builddir=dist/${target}"
	  fcompile=" --with-ghc=${target}-ghc"
	  fcompile+=" --with-ghc-pkg=${target}-ghc-pkg"
	  fcompile+=" --with-gcc=${target}-clang"
	  fcompile+=" --with-ld=${target}-ld"
	  fcompile+=" --with-hsc2hs=${target}-hsc2hs"
	  fcompile+=" --hsc2hs-options=--cross-compile"
	  fconfig="--disable-shared --configure-option=--host=${target}"
	  case $1 in
	      configure|install) flags="${fcommon} ${fcompile} ${fconfig}" ;;
	      build)             flags="${fcommon} ${fcompile}" ;;
	      new-configure|new-install) flags="${fcompile} ${fconfig}" ;;
	      new-build)         flags="${fcompile}" ;;
	      list|info|update)  flags="" ;;
	      "")                flags="" ;;
	      *)                 flags=$fcommon ;;
	  esac;;

When the command matches hsc2hs:

*-hsc2hs) flags=" --cross-compile" ;;

When the command matches various apples:

# iOS -- this is run through Apple's xcrun tool.

aarch64-apple-ios-clang|aarch64-apple-ios-ld)
	  flags="--sdk iphoneos ${cmd} -arch arm64"
	  cmd="xcrun" ;;

aarch64-apple-ios-*|aarch64-apple-ios-*)
	  flags="--sdk iphoneos ${cmd}"
	  cmd="xcrun" ;;

# iOS (64bit simulator)
x86_64-apple-ios-clang|x86_64-apple-ios-ld)
	  flags="--sdk iphonesimulator ${cmd} -arch x86_64"
	  cmd="xcrun" ;;

x86_64-apple-ios-*)
	  flags="--sdk iphonesimulator ${cmd}"
	  cmd="xcrun" ;;


# looks like these are not for the phone
aarch64-apple-darwin-clang|aarch64-apple-darwin-ld|arm64-apple-darwin-clang|arm64-apple-darwin-ld)
	  flags="--sdk macosx ${cmd} -arch arm64"
	  cmd="xcrun" ;;

aarch64-apple-darwin-*|arm64-apple-darwin-*)
	  flags="--sdk macosx ${cmd}"
	  cmd="xcrun" ;;

When the command matches llvm:

*-llvm-*|wasm32-*-nm|wasm32-*-ar|wasm32-*-ranlib) cmd="llvm-$cmd" ;;

When the command matches the other tools:

# they retain their original cmd and flags
*-nm|*-ar|*-ranlib) ;;

Finally, for everything else, it just runs the command:

*) exec $cmd $flags "$@" ;;

$@ refers to all of the script's command line arguments

https://github.com/ghc-ios/ghc-ios-scripts/blob/master/ghc-ios Shows some examples of the flags used with the ghc’s

XCode in Nix

http://sandervanderburg.blogspot.com/2012/12/deploying-ios-applications-with-nix.html https://github.com/NixOS/nixpkgs/tree/master/pkgs/os-specific/darwin https://github.com/NixOS/nixpkgs/blob/release-21.11/pkgs/top-level/darwin-packages.nix https://github.com/svanderburg/nix-xcodeenvtests

lrwxr-xr-x  1 sander  staff  94  1 jan  1970 Simulator -> /Applications/Xcode.app/Contents/Developer/Applications/Simulator.app/Contents/MacOS/Simulator
lrwxr-xr-x  1 sander  staff  17  1 jan  1970 codesign -> /usr/bin/codesign
lrwxr-xr-x  1 sander  staff  17  1 jan  1970 security -> /usr/bin/security
lrwxr-xr-x  1 sander  staff  21  1 jan  1970 xcode-select -> /usr/bin/xcode-select
lrwxr-xr-x  1 sander  staff  61  1 jan  1970 xcodebuild -> /Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild
lrwxr-xr-x  1 sander  staff  14  1 jan  1970 xcrun -> /usr/bin/xcrun 
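
Such a proxy directory can be put together by hand and prepended to PATH (a sketch mirroring the listing above; it assumes Xcode sits in the default /Applications location):

mkdir -p xcode-wrapper && cd xcode-wrapper
# plain symlinks to the system stubs ...
ln -s /usr/bin/xcode-select
ln -s /usr/bin/security
ln -s /usr/bin/codesign
ln -s /usr/bin/xcrun
# ... and to the tools inside the Xcode bundle
ln -s /Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild
ln -s /Applications/Xcode.app/Contents/Developer/Applications/Simulator.app/Contents/MacOS/Simulator
export PATH="$PWD:$PATH"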

Cabal management

https://stackoverflow.com/questions/25765893/how-do-i-install-dependencies-when-cross-compiling-haskell-code

Stages of GHC Compilation

  • clang is failing for cabal
    • can I manually compile the stage?
  • can investigate ghc-pkg
    • should make a shell

mobile-core-log summary

Questions

  • What are statically compiled musl binaries?
    • What does it mean by not resolving and crashing?
      • Symbols don’t exist?
      • Why would it crash?
  • What are glibc compiled binaries?
    • Are they a universal sort of binary?
  • What does it mean by linker64 not being readily available?
  • Does architecture difference really affect haskell code execution?
    • Seems like only if using IO
    • If not, then just generate TH splices using build machine

Reading

https://github.com/zw3rk/mobile-core-log/blob/master/LOG.md

iOS:

  • build for macOS
  • patch up objects load commands from macOS -> iOS
    • to get around the linker being overly zealous

Android:

  • Multiple architectures.
    • For now only focus on aarch64
  • Can’t use statically compiled musl binaries
    • They don’t resolve and crash (missing symbols?)
  • Can't use glibc compiled binaries
    • They have more missing symbols
  • Can't use bionic (android's libc) compiled libraries
    • By default dynamic
    • linker64 isn't readily available
    • Building linker64 from Google sources is a pain.
      • They use a build tool called soong.
        • It replaced their make files, and will soon be replaced by bazel.
      • building soong is barely documented
    • building bionic libc from source requires soong
      • could construct a makefile
    • building static libc, libm, … from bionic should allow building android executables that just run on linux as long as they don't use android APIs
    • building linker64 should allow running android executables on linux as long as they don't use android APIs
  • Android
    • aarch64
    • aarch32
    • x86_64
    • mips
    • risc-v
    • maybe others
  • executable binary
  • machine code
  • container formats
  • linking terminology
  • thoughts on c standard library
  • code -> assembly printer -assembler-> byte stream of instructions specific to target CPU
    • x86_64/aarch64

https://godbolt.org/

  • three major container formats for applications
    • Portable Executable (PE) on Windows
    • Mainly ELF on Linux
    • Mach-O pretty much exclusively on Apple
  • The file formats are what the kernel reads
    • Decides what executable code to place where in memory
    • Nice to have metadata
      • allow us to have different sections that describe
        • machine code (text sections)
        • data sections (literal strings and stuff)
  • When kernel loads application into memory
    • it will fix up references in the instructions
      • so that pointers point to the e.g string literals in data section
    • this process is called relocation
  • if ELF file
    • can query relocations with
      • readelf
  • mach-o
    • otool
  • PE’s
    • probably some tool as well
  • lib.c -> lib.o (an object file, in the container format of the OS)
    • thus it won't be identical across different operating systems
  • main.c uses functions from lib.c
    • main.o will have symbolic references (symbols)
      • to something that's not contained in main.o
      • can't turn main.o straight into the main executable
        • we would run into the problem of not being able to call the library function at runtime
    • the library function would not be found
  • the linker ld will complain about this
    • that it can't turn main.o on its own into an executable
  • provide the linker with lib.o and main.o and it will link
    • need to provide the object files in a certain order
  • so we statically linked lib.o and main.o into main (a worked example is sketched at the end of this list)
    • most straightforward way of linking
      • allows the linker to discard symbols that are not used from the final product
  • dynamic linking and dynamic shared objects (DSOs)
    • .dll dynamic link lib
    • .so shared object
    • .dylib dynamic lib
  • tell the linker to link shared lib
    • wont include into final exe
      • exe will reference lib.dll/.so/.dylib
  • kernel has loader program that delegates dynamic library resolution and linking
    • which program to use can be encoded in the container
      • (PE, ELF, Mach-O)
    • kernel just needs to know which program to hand control over for linking
  • .a files (archives) is a collection of .o files
    • archive files across platforms don’t differ too much
      • but not enough for 1:1 compatibility
  • can iterate over each object file in archive
  • archives can have multiple entries with the same name
  • can also encode any kind of data in archives, not just obj files
  • if our .a contains multiple .o’s we can just pass the .a to the linker and it will link
  • if we provide our linker with -l<name>, it will look for lib<name>.<ext>
    • in the paths we pass via -L/path/to/some/where
    • if we ask the linker to produce a shared executable
      • via -shared flag
  • it will look for .so or .dylib first and then fall back to .a
    • -static
  • it will look for .a only
  • compiler will pass a few default libs to the linker
    • unless you specifically ask not to via -nostdlib
    • compiler may be giving you libc, libm, and, libdl (dynamic loading)
  • underneath libc are system calls
    • system calls tell that kernel to invoke a kernel function
    • syscalls are not necessarily stable across kernel versions, nor the same across operating systems
      • thus every OS comes with its own libc
  • there can be multiple libraries that implement system calls (multiple libc’s)
    • not common for windows or macOS
    • but a few for linux
      • glibc gnu libc
      • musl
      • diet
      • and others including
  • bionic, the libc for android
    • android is a linux kernel with a custom libc called bionic
      • it's a mash of BSD libc's with some Google libc and other features.
    • Can recompile the main.c on various platforms and have it work
      • as they all provide libcs that provide printf
        • and with the same semantics
      • this is where standards for those libraries arise. i.e POSIX
        • if your libc is POSIX standard compliant.
  • software written against POSIX functions should work on any operating system that comes with a kernel and a libc that implement those functions
  • if your programming language doesn't want to rely on any of this libc logic
    • it can provide low level implementation
      • e.g instead of calling printf and other functionality provided by the libc
  • can just call into the kernel directly with sys calls (still seems to require standards such as POSIX)
    • this is what Go did
    • requires relying on the stability of syscalls
  • usually syscalls maintain backward compat across kernel versions
    • but this is a different case for macOS
      • no guarantee
    • best to use the libc of macOS, i.e. libSystem
    • the kernel doesn't provide a loader for DSOs, so need to provide our own
      • if we build a fully static exec, we only need a kernel that supports the syscalls of the libc we linked into our executable.
  • System triples: <architecture>-<vendor>-<operating system>
    • architectures:
      • aarch64 (alias arm64)
      • x86_64 (alias amd64)
      • i386
      • ....
    • vendors:
      • pc
      • none
      • unknown
      • apple
    • operating-system:
      • kernel describes os sufficiently enough
      • macos
      • ios
      • android
      • windows
      • darwin
      • freebsd
      • linux
        • darwin is umbrella across macos, ios, tvos
    • anything that supposed to work on apples kernel with libSystem
      • darwin12 with system version specified
        • similarly freebsd12
        • also theres
    • linux-gnu
    • linux-musl
      • to indicate the libc
  • aarch64-darwin
    • means apple operating system on a 64bit arm cpu
  • aarch64-android
  • aarch64-linux-bionic
    • are synonymous
  • a lot of effort goes into making autotools parse triples
    • vendor can be optional
  • aarch64-ios and aarch64-macos are both aarch64-darwin
    • if we build a lib on aarch64-macos
      • with a native stage2 (opposed to cross compiler)
      • its the same as aarch64-ios?
  • GHC Stages:
    • When building GHC, use a bootstrap (stage0) compiler to build the stage1 compiler
    • depending on the difference between the stage0 compiler and stage1
      • might run into subtle bugs
        • stage1 does not follow the
    • same codegen or
    • calling conventions
      • as the stage0 compiler we used to build it with
        • to overcome, need to build ghc from source again
    • but this time use stage1 compiler
    • this yields full stage2 compiler
      • we’re running the compilers on the same sources?
  • If a cross compiler would compile itself
    • it would produce a compiler for the target
      • but not for the host we want it to run it on
        • thus cross compilers are limited to stage1 compilers
  • ideally we build cross compilers as virtual stage3 compilers
    • we build the cross compiler with a bootstrap compiler built from the identical sources
  • can we not use a stage2 compiler and then use that compiler on the cross compiler sources?
  • goal is to compile the haskell library natively on macOS
    • need a slightly modified compiler
      • needs to have
        • --disable-large-address-space
        • set during configure
    • this will prevent the runtime from trying to grab
      • 1T of address space during initialisation
        • this will fail on iOS, as iOS is not as permissive as macOS
    • flake.nix for this is added to the repository
    • haskell.nix is used to turn the cabal project into something nix can use
    • flake-utils to multiplex this across various targets
      • a bit extra glue to have cross targets as well
    • some postinstall packaging hooks, to have hydra provide .zip files containing the library
  • https://ci.zw3rk.com/eval/374#tabs-still-succeed
    • file *.o
    • shasum *.o
    • $ hexdump -C test-ios.o > test-ios.o.hd
    • $ hexdump -C test-mac.o > test-mac.o.hd
    • $ otool -l test-ios.o > test-ios.o.otool
    • $ otool -l test-mac.o > test-mac.o.otool
    • $ diff -u test-ios.o.otool test-mac.o.otool
  • for iOS, can natively compile on macOS (aarch64/x86)
    • since both macOS and iOS use Mach-O
    • although within the Mach-O there are some headers specifying the platform it was built for
      • can artificially modify the header to make it run
  • is ghc compiling with --sdk macosx?
  • when would this cause problems
    • for safety it wouldn't be too much of an issue to compile with the iphone sdk
  • $ mac2ios Libraries/libHSmobile-core-0.1.0.0-HfUuggbqw4DC9ci8Blc8Tf-ghc8.10.7.a
  • $ mac2ios Libraries/libffi.a
  • $ mac2ios Libraries/libgmp.a
    • need to disable bitcode
  • haskell.nix must have something for the cabal wrapper and working with xcode
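
Tying the linking notes above together, a minimal end-to-end sketch (hypothetical lib.c/main.c as in the notes; shown with the ELF tools on Linux, with nm/otool being the Mach-O counterparts):

# compile each translation unit to an object file
cc -c lib.c -o lib.o
cc -c main.c -o main.o
nm main.o                   # shows the undefined symbol for the library function

# static linking: the linker resolves main.o's reference directly from lib.o
cc main.o lib.o -o main-static

# dynamic linking: build a shared object and only record a reference to it;
# the loader resolves it at run time
cc -shared -fPIC lib.c -o liblib.so
cc main.o -L. -llib -o main-dynamic
readelf -d main-dynamic     # shows the NEEDED entry for liblib.so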

Nix: Reflex-FRP, haskell.nix

Tools For Inspecting Nix Expressions

https://github.com/utdemir/nix-tree

nix-store -q --referrers
nix-store -q --references

nix-store --query

nix-store --query --tree nix/store.....drv

nix-store --query --referrers-closure nix/store.....drv

nix-store --query --tree nix/store.....

nix-store --query --referrers-closure nix/store.....

nix-store --query nix/store.....

nix show-derivation /nix/store/z3hhlxbckx4g3n9sw91nnvlkjvyw754p-myname.drv

nix-store -r /nix/store/z3hhlxbckx4g3n9sw91nnvlkjvyw754p-myname.drv

nix-diff

tree

Reflex-FRP

https://github.com/obsidiansystems/obelisk

Commands

nix-instantiate -A android.frontend

/nix/store/wb9xmaza1fym7rxz1srpc23c9gifkbib-android-app.drv

nix show-derivation /nix/store/wb9xmaza1fym7rxz1srpc23c9gifkbib-android-app.drv

nix-query-tree-viewer nix-tree

When run on a derivation, can see the other input derivations. When run on a store path, can see the runtime deps, although as of now not sure of the significance of this.

What am I looking for?

  • How does obelisk work?
    • How does it compile the cabal project?
      • What input derivations are used to create the compiler?
      • What input derivations are used to create the build cabal?
      • What flags and settings does cabal take to create the output?
      • What are the outputs of the compiled code?
    • How does it build an android app?
      • How does it use the Haskell outputs to link with the android app?
      • What android boilerplate is there to create the app?
      • What are the commands and tools used to compile the app?
      • How does it package the android app that’s ready to be pushed onto the device?

How am I going to find this out?

  • Have the derivations available.
    • This will tell me the raw inputs that make up a certain output/derivation.
      • Build Script
      • Source Code
      • Command Line Tools and Libraries
    • In particular, the build script will reveal the commands that are run for each step and will reveal critical information.
      • The environment variables used in the build script can be inspected with the 'nix show-derivation' command.
    • However this doesn't really tell me how the nix expressions are conceptually structured.
      • Since the syntactic structure of a nix expression doesn't necessarily correspond to the structure of the derivations it generates.
  • Will perhaps have to dig through the nix expressions of the sample project.
    • Perhaps the nix expression digging can be led by derivation digging.

Derivation Inspection

nix-query-tree-viewer /nix/store/wb9xmaza1fym7rxz1srpc23c9gifkbib-android-app.drv
nix show-derivation /nix/store/wb9xmaza1fym7rxz1srpc23c9gifkbib-android-app.drv

"builder": "/nix/store/506nnycf7nk22x7n07mjjjl2g8nifpda-bash-4.4-p23/bin/bash", "args": [ "-e", "/nix/store/9krlzvny65gdc8s7kpb6lkx8cd02c25b-default-builder.sh" ],

"env": { "buildCommand": "mkdir -p "$out/bin"\ncp -r "$src"/* "$out"\nsubstitute /nix/store/x3h06wmikhw3z05mqx84aghb9ydg5bk4-deploy.sh $out/bin/deploy \\n --subst-var-by coreutils /nix/store/3kqc2wmvf1jkqb2jmcm7rvd9lf4345ra-coreutils-8.31 \\n --subst-var-by adb /nix/store/6n00n2l97323dmrx63bm6ssf4ls0f5lx-platform-tools-29.0.6 \\n --subst-var-by java /nix/store/6yicryn6ycbl8ipc5np67gkn2y5a60is-openjdk-12.0.2-ga \\n --subst-var-by out $out\nchmod +x "$out/bin/deploy"\n", "buildInputs": "/nix/store/jix2yw3rx0xp56lzfgl33llgakv01pb7-androidsdk", "builder": "/nix/store/506nnycf7nk22x7n07mjjjl2g8nifpda-bash-4.4-p23/bin/bash", … }

cat /nix/store/9krlzvny65gdc8s7kpb6lkx8cd02c25b-default-builder.sh

source $stdenv/setup genericBuild

“stdenv”: “/nix/store/lac6smkv5bgjf5ijiggdpiwx3jsrcypn-stdenv-linux”

$stdenv/setup contains the definition for genericBuild

genericBuild simply executes the buildCommand since it is defined in the env variables

genericBuild() {
    if [ -f "${buildCommandPath:-}" ]; then
        local oldOpts="-u"
        shopt -qo nounset || oldOpts="+u"
        set +u
        source "$buildCommandPath"
        set "$oldOpts"
        return
    fi
    if [ -n "${buildCommand:-}" ]; then
        local oldOpts="-u"
        shopt -qo nounset || oldOpts="+u"
        set +u
        eval "$buildCommand"
        set "$oldOpts"
        return
    fi
    # ... (rest of genericBuild runs the individual phases: unpackPhase,
    # buildPhase, installPhase, etc. -- elided here)
}
mkdir -p \"$out/bin\"\ncp -r \"$src\"/* \"$out\"\nsubstitute /nix/store/x3h06wmikhw3z05mqx84aghb9ydg5bk4-deploy.sh $out/bin/deploy \\\n  --subst-var-by coreutils /nix/store/3kqc2wmvf1jkqb2jmcm7rvd9lf4345ra-coreutils-8.31 \\\n  --subst-var-by adb /nix/store/6n00n2l97323dmrx63bm6ssf4ls0f5lx-platform-tools-29.0.6 \\\n  --subst-var-by java /nix/store/6yicryn6ycbl8ipc5np67gkn2y5a60is-openjdk-12.0.2-ga \\\n  --subst-var-by out $out\nchmod +x \"$out/bin/deploy\"\n
mkdir -p $out/bin
cp -r $src/* $out
substitute /nix/store/x3h06wmikhw3z05mqx84aghb9ydg5bk4-deploy.sh $out/bin/deploy
--subst-var-by coreutils /nix/store/3kqc2wmvf1jkqb2jmcm7rvd9lf4345ra-coreutils-8.31
--subst-var-by adb /nix/store/6n00n2l97323dmrx63bm6ssf4ls0f5lx-platform-tools-29.0.6
--subst-var-by java /nix/store/6yicryn6ycbl8ipc5np67gkn2y5a60is-openjdk-12.0.2-ga
--subst-var-by out $out
chmod +x $out/bin/deploy

Mainly just copies the stuff in src into the output. "src": "/nix/store/q3f6za3ajl240lqp87dvc5nbajqav2sg-systems.obsidian.obelisk.examples.minimal". This is the android .apk app.

nix-store --query --deriver /nix/store/q3f6za3ajl240lqp87dvc5nbajqav2sg-systems.obsidian.obelisk.examples.minimal

This gives us the derivation: /nix/store/dnq3ydsndizynm8l5nqfvz0nx38zckyd-systems.obsidian.obelisk.examples.minimal.drv

nix-query-tree-viewer /nix/store/dnq3ydsndizynm8l5nqfvz0nx38zckyd-systems.obsidian.obelisk.examples.minimal.drv
nix show-derivation /nix/store/dnq3ydsndizynm8l5nqfvz0nx38zckyd-systems.obsidian.obelisk.examples.minimal.drv

The builder is also just a genericBuild

The derivation has a buildPhase env variable defined.

\nbuildDir=`pwd`\ncp -rL $ANDROID_HOME $buildDir/local_sdk\nchmod -R 755 local_sdk\nexport ANDROID_HOME=$buildDir/local_sdk/android-sdk\n# Key files cannot be stored in the user's home directory. This\n# overrides it.\nexport ANDROID_SDK_HOME=`pwd`\n\nmkdir -p \"$ANDROID_HOME/licenses\"\necho -e \"\\n8933bad161af4178b1185d1a37fbf41ea5269c55\" > \"$ANDROID_HOME/licenses/android-sdk-license\"\necho -e \"\\n84831b9409646a918e30573bab4c9c91346d8abd\" > \"$ANDROID_HOME/licenses/android-sdk-preview-license\"\n\nexport APP_HOME=`pwd`\n\nmkdir -p .m2/repository\nif [ -d \"$DEPENDENCIES/m2\" ] ; then\n  cp -RL \"$DEPENDENCIES\"/m2/* .m2/repository/\nfi\nchmod -R 755 .m2\nmkdir -p .m2/repository/com/android/support\ncp -RL local_sdk/android-sdk/extras/android/m2repository/com/android/support/* .m2/repository/com/android/support/\ngradle assembleDebug --offline --no-daemon -g ./tmp -Dmaven.repo.local=$(pwd)/.m2/repository\n
buildDir=`pwd`
cp -rL $ANDROID_HOME $buildDir/local_sdk
chmod -R 755 local_sdk
export ANDROID_HOME=$buildDir/local_sdk/android-sdk
# Key files cannot be stored in the user's home directory. This
# overrides it.
export ANDROID_SDK_HOME=`pwd`

mkdir -p $ANDROID_HOME/licenses
echo -e "\n8933bad161af4178b1185d1a37fbf41ea5269c55" > $ANDROID_HOME/licenses/android-sdk-license
echo -e "\n84831b9409646a918e30573bab4c9c91346d8abd" > $ANDROID_HOME/licenses/android-sdk-preview-license

export APP_HOME=`pwd`

mkdir -p .m2/repository

if [ -d $DEPENDENCIES/m2 ] ; then
    cp -RL $DEPENDENCIES/m2/* .m2/repository/
fi
chmod -R 755 .m2
mkdir -p .m2/repository/com/android/support
cp -RL local_sdk/android-sdk/extras/android/m2repository/com/android/support/* .m2/repository/com/android/support/

gradle assembleDebug --offline --no-daemon -g ./tmp -Dmaven.repo.local=$(pwd)/.m2/repository

The script sets up the paths for gradle to build.

Namely it sets up maven dependencies from the nix store and then uses them to build the gradle project. Maven deps: /nix/store/j9iaycd4hnsylhgi7kkpy77m1b46k7hz-systems.obsidian.obelisk.examples.minimal-maven-deps

Probably at this point it's a good idea to find where these build scripts correspond to in the nix expressions. What does buildInputs do? Probably something for stdenv: https://nixos.org/manual/nixpkgs/stable/#ssec-stdenv-dependencies

buildPhase is just one of the phases that gets executed in the genericBuild

installPhase is also defined

mkdir -p $out
cp -RL build/outputs/apk/*/*.apk $out

first comes unpackPhase

  • it uses the $src / $srcs env variables

src is /nix/store/cvwszjih0wrm8fayfvhdb272dk7s9j3d-android-app /nix/store/62w6j6dsbn5mjr21kvpwx6px7dzzwkwf-android-app.drv This contains the java source code. It also contains the compiled haskell libraries

Maven Dependencies are these: /nix/store/j9iaycd4hnsylhgi7kkpy77m1b46k7hz-systems.obsidian.obelisk.examples.minimal-maven-deps

nix show-derivation /nix/store/62w6j6dsbn5mjr21kvpwx6px7dzzwkwf-android-app.drv

this one has a bunch of input derivations, some of which look like the compiled results of the haskell code. buildCommand is also something to look into

buildCommand:

cp -r --no-preserve=mode $src $out
mkdir -p $out/src/main
cp -r --no-preserve=mode $javaSrc $out/src/main/java
ln -s $buildGradle $out/build.gradle
ln -s $androidManifestXml $out/AndroidManifest.xml
mkdir -p $out/res/values
ln -s $stringsXml $out/res/values/strings.xml
mkdir -p $out/jni
ln -s $applicationMk $out/jni/Application.mk

{
    ARCH_LIB=$out/lib/arm64-v8a
    mkdir -p $ARCH_LIB
    local exe=/nix/store/s4qvcl4z6z8g3nmvwnydj5gwgviaqkn7-frontend-0.1-aarch64-unknown-linux-android/bin/libfrontend.so
    if [ ! -f $exe ] ; then
	  >&2 echo 'Error: executable \"frontend\" not found'
	  exit 1
    fi
    cp --no-preserve=mode $exe $ARCH_LIB/libHaskellActivity.so

}
....

Probably can look this up in the nix expressions

/nix/store/m52s5zrg4vdb4v7ll54p27ipnyrh6d71-frontend-0.1-armv7a-unknown-linux-androideabi.drv

/nix/store/rjw2armia65m1w5bhc8vjlqamxvnn5ia-frontend-0.1-aarch64-unknown-linux-android.drv

nix show-derivation /nix/store/rjw2armia65m1w5bhc8vjlqamxvnn5ia-frontend-0.1-aarch64-unknown-linux-android.drv

This is the derivation that compiles the haskell libraries.

Encountered this: https://github.com/wavewave/nix-build-ghc-android

Could clone reflex-frp obsidian to do greps over.

nixpkgs/pkgs/development/haskell-modules/generic-builder.nix looks like it has work done for cross compilation

How does it link up the android project files with the haskell outputs?

The derivation responsible for compiling the haskell is this one: /nix/store/rjw2armia65m1w5bhc8vjlqamxvnn5ia-frontend-0.1-aarch64-unknown-linux-android.drv

Maybe the derivation that links the build outputs is the previous one looked at: /nix/store/62w6j6dsbn5mjr21kvpwx6px7dzzwkwf-android-app.drv

reflex-platform/android/impl.nix contains code partly responsible for the android-app.drv reflex-platform/android/default.nix

New Questions and Goals

  • How does it handle Template Haskell?
    • It probably uses the solution of:
      • nixpkgs/pkgs/development/haskell-modules/generic-builder.nix
      • /nix/store/rjw2armia65m1w5bhc8vjlqamxvnn5ia-frontend-0.1-aarch64-unknown-linux-android.drv
        • The derivation might have TH hints.
  • How can I have a separate haskell app.
    • And then java/kotlin code for android
      • Where do I pass java code for android?
    • Swift code for iOS.

reflex-platform mentions Template Haskell, although no iserv

Mentions of loading and saving splices: reflex-platform/haskell-overlays/splices-load-save