
Support cross-compiling in builder #173

Closed
nschoellhorn opened this issue May 23, 2021 · 7 comments · Fixed by #176

@nschoellhorn

I work on my operating system on an Apple Silicon M1 MacBook (aarch64) from time to time, and when I try to build the boot image, it always fails with the message `This crate only supports the x86_64 architecture.` because the builder is compiled for the current architecture, of course. I ended up cloning the repo and adding the TARGET environment variable to the builder process to make it cross-compile for x86_64. That works for me, but I'm not sure it's the best way to do this.
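
(Roughly the kind of override described above, as a sketch; the helper name and fallback triple here are hypothetical, not the actual builder code.)

```rust
use std::env;

// Hypothetical sketch: let an explicit TARGET environment variable
// override the default target triple when the builder decides what to
// compile for. The function name and fallback triple are illustrative.
fn builder_target() -> String {
    env::var("TARGET").unwrap_or_else(|_| "x86_64-unknown-none".to_string())
}

fn main() {
    println!("building for target: {}", builder_target());
}
```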

Maybe the crate could get a config option for cross-compiling, or always force a specific target? I'm not sure what issues might arise from that, so I thought I'd open this issue first to discuss.

If we can agree on a good solution, I'd be happy to submit a PR for that! :)

Thanks for your work!

@phil-opp
Member

Thanks for reporting! Could you be more specific about the error that occurs and how you were able to fix it? I tried looking through your bmos repo and found this line, but I'm not sure if that's what you meant.

@nschoellhorn
Author

Oh well, sorry, I thought it would be obvious 😅

So what I get on my M1 MacBook is the error thrown here:

```rust
compile_error!("This crate only supports the x86_64 architecture.");
```
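
(For reference, such a guard is usually gated on the target architecture via a `cfg` attribute, roughly like the sketch below; the exact condition in the bootloader source may differ.)

```rust
// Sketch of a typical architecture guard: fire the compile error only
// when the crate is compiled for a non-x86_64 target.
#[cfg(not(target_arch = "x86_64"))]
compile_error!("This crate only supports the x86_64 architecture.");
```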

Actually, that line you found was one attempt, but it didn't work. What I ended up doing on my MacBook is changing the builder alias in the bootloader crate to `run --target=x86_64-apple-darwin --bin builder --features builder --quiet --` (instead of invoking it without a specific target).
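
(In `.cargo/config.toml` terms, that change looks roughly like the sketch below; the alias name and flags follow the description above, and the surrounding contents of the file in the bootloader repo are not shown.)

```toml
# Sketch of the adjusted cargo alias: force an explicit x86_64 host
# target instead of compiling the builder for the current architecture.
[alias]
builder = "run --target=x86_64-apple-darwin --bin builder --features builder --quiet --"
```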

Now that I keep thinking about it, it doesn't really make sense to enforce x86 as the architecture for the builder itself, does it? What should be x86 is the target that the builder builds for, right? And since the target for the bootloader is hardcoded anyway, I don't really see the point of the compile-time check for the x86 architecture. Am I missing something?

@phil-opp
Member

Thanks for clarifying! So using the x86_64-apple-darwin target works because it passes the architecture check and is still able to run on your M1 (via emulation)?

> Now that I keep thinking about it, it doesn't really make sense to enforce x86 as the architecture for the builder itself, does it? What should be x86 is the target that the builder builds for, right?

Yes, it doesn't make sense to force an architecture for the builder executable. Seems like I forgot to adjust the check when I added the builder.

> And since the target for the bootloader is hardcoded anyway, I don't really see the point of the compile-time check for the x86 architecture. Am I missing something?

I want to prevent people from trying to use the bootloader crate for their ARM kernel. (The library part of the bootloader is also built for the kernel target.) So I would like to keep the check, but we can of course disable it when the builder binary is built. I'll look into it, thanks again for reporting!
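
(One possible shape of such a change, sketched under the assumption that the builder binary is selected via the `builder` feature mentioned above; #176 may implement it differently.)

```rust
// Hypothetical fix: keep the architecture guard for the library part,
// but skip it when the builder binary is being built (here assumed to
// be selected via a `builder` cargo feature).
#[cfg(all(not(target_arch = "x86_64"), not(feature = "builder")))]
compile_error!("This crate only supports the x86_64 architecture.");
```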

@nschoellhorn
Author

> Thanks for clarifying! So using the x86_64-apple-darwin target works because it passes the architecture check and is still able to run on your M1 (via emulation)?

Yes, exactly. The image is still built for x86, and then I run it with a custom-patched qemu-system-x86_64, which works well :D
Debugging is still quite a hassle, since there's no GDB for aarch64, but the bundled multiarch GDB that comes with CLion does the job most of the time 😅

@phil-opp
Member

I created a PR with a potential fix in #176. Could you check whether it fixes your problem, @nschoellhorn?

@nschoellhorn
Author

Works like a charm, thank you so much for the quick reaction!

@phil-opp
Member

Awesome, thanks for testing!
