
[RFC] Testing and releasing amaranth-boards #128

Open
whitequark opened this issue Nov 25, 2020 · 11 comments
whitequark commented Nov 25, 2020

There are currently two problems with amaranth-boards CI/tests:

  • They are useless (i.e. don't find actual issues);
  • They are broken (i.e. don't pass when there is no issue).

There is currently one problem with nmigen-boards releases:

  • They aren't.

The plan I have for amaranth-boards is to release every commit (that passes checks) on PyPI. There's simply no other option besides giving up and asking downstream to use git dependencies:

  • The boards are almost all independent, but it would be wildly impractical to have a PyPI package per board.
  • Both individual boards and nmigen-boards as a whole lack anything remotely close to the concept of a "release".
  • Changes to existing boards should be released as soon as practical because they are either bugfixes (very important!) or additions of missing resources (pretty important too).

Technically, doing this is easy. But there are two issues that arise as a result.

  1. What should be amaranth-boards' version number?
  2. What version of amaranth should amaranth-boards depend on?

The first question is easy to answer. Since all of the boards are shipped together (and with resources, too), any breaking change to any board or any resource means bumping the major version. We'll end up with a major version in the hundreds very quickly, but that's OK. This allows us to do sweeping changes (like extracting resources) when necessary, and still have a version number that is more useful than a git hash.

The second question is more tricky. Since every commit to amaranth-boards becomes a release, it means that amaranth-boards should, at worst, depend on the latest released version of nmigen. But this means that, naively, adding a new platform to amaranth or changing a corner case of an existing one (something to do with oscillator instances for example) requires an amaranth release, since otherwise amaranth-boards would be broken (like it currently is). This is obviously impractical.

I propose a novel solution that I will call "fine-grained dependencies". Most of the boards in amaranth-boards are completely static (as they should be: the schematic is fixed, and the resources should not be changed on a whim) and so they only depend on old features. But some boards, particularly recent additions, will change a lot or require extremely recent features. Such boards should call some kind of function (or use any other mechanism) that would request the nMigen version through importlib_metadata and raise an ImportError if nMigen is too old.

This serves a twofold purpose:

  • Downstream users get a nice error message explaining why the board file can't be used, instead of an opaque crash that looks like a bug.
  • Our CI gains a way to ignore these boards when testing against latest nmigen release, yet ensure they pass tests when testing against latest nmigen snapshot.

We also should actually thoroughly test the boards and not just check that they import without errors, using some or all of the following strategies:

  1. Make sure that boards with LEDs can translate blinky to Verilog.
  2. Make sure that boards with LEDs and FOSS toolchains can compile blinky to bitstream.
  3. Make sure that every board uses only non-overlapping pins in resources. This will probably require some heuristics and/or DSL improvements to account for alternate resource variants (think spi_flash vs spi_flash_2x). "Pin overlap is OK if there is a shared name prefix and an identical number" seems promising.
  4. Make sure that every board with a FOSS toolchain uses only pins that exist.
  5. The same as (3) and (4) but taking connectors into account.
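The prefix-and-number heuristic from point (3) could be prototyped roughly as below. This is a sketch that models resources as plain `(name, number, pins)` tuples rather than the actual amaranth-boards `Resource` objects, and the `_Nx` suffix handling is only one possible reading of the heuristic.

```python
import itertools
import re

def _prefix(name: str) -> str:
    # "spi_flash_2x" and "spi_flash" share the prefix "spi_flash"
    return re.sub(r"_\d+x$", "", name)

def overlap_ok(a, b) -> bool:
    # Pin overlap is tolerated when resources share a name prefix and number.
    (name_a, num_a, _), (name_b, num_b, _) = a, b
    return _prefix(name_a) == _prefix(name_b) and num_a == num_b

def find_conflicts(resources):
    # Return (name_a, name_b, shared_pins) for every disallowed overlap.
    conflicts = []
    for a, b in itertools.combinations(resources, 2):
        shared = set(a[2]) & set(b[2])
        if shared and not overlap_ok(a, b):
            conflicts.append((a[0], b[0], sorted(shared)))
    return conflicts
```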
whitequark added the rfc label Nov 25, 2020
rroohhh (Contributor) commented Nov 25, 2020

For boards with FOSS toolchains it would be nice to check additional properties: that each IOSTANDARD is valid (it exists, and, for example, a differential standard is used for differential pins), that pins on the same bank use non-conflicting IOSTANDARDs, and more generally that all the Attrs are valid.
This raises the question of how to handle boards that have multiple configurations (like the Atlys board, which changes IOSTANDARDs depending on a constructor argument).

whitequark (Member Author) commented

It would indeed be nice, but I think we should ship something much simpler first.


mithro commented Nov 25, 2020

Such boards should call some kind of function (or use any other mechanism) that would request the nMigen version through importlib_metadata and raise an ImportError if nMigen is too old.

I'm pretty sure this is the wrong solution, but I'm going to suggest it anyway.

You could try to import all board modules, catching import errors. If a board import fails with such an error, replace the board with a class that raises an exception only when someone actually tries to use the board....

I have seen this idea go wrong in so many different ways (which is what makes me think it is the wrong solution), but if there is actually someone who could make it work in a sane and reliable way, it would probably be @whitequark...


mithro commented Nov 25, 2020

If you are going the automated publishing to PyPI route, maybe the idea of just publishing every board as a separate PyPI package works? That way users only depend directly on the boards they actually want to support (and thus a minimum nmigen version?).

You could also then have a nmigen-boards meta-package which depends on them all for convenience, or even go further and have different meta-package "groupings" of nmigen-boards (like nmigen-boards-vivado: any board which works with the Vivado platform) or something....

whitequark (Member Author) commented

I have seen this idea go wrong in so many different ways (which is what makes me think it is the wrong solution), but if there is actually someone who could make it work in a sane and reliable way, it would probably be @whitequark...

@mithro Hooking into the board import code path is basically what I want to do, except, as you correctly note, in a sane and reliable way.

If you are going the automated publishing to PyPI route, maybe the idea of just publishing every board as a separate PyPI package works? That way users only depend directly on the boards they actually want to support (and thus a minimum nmigen version?).

I have of course considered that, but it's a logistical nightmare: it increases the workload required to add a new board several times over, it requires some sort of complicated automatic versioning scheme (because there is no way that can be managed manually), it will annoy our users (especially those who install the full package), it will no doubt expose new and exciting edge cases in the pip dependency resolver, and so on.

Sure, the idea sounds reasonable at first glance, but... even the 4 PyPI packages nmigen is currently split into present a noticeable maintenance burden. YoWASP, which is a dozen packages, is a still greater burden, even though I have automated every action I could script and made it run reliably. Split nmigen-boards would be nothing short of a total nightmare.

whitequark (Member Author) commented

Actually, I forgot the part where I would have to manually click through 50 (and growing) PyPI packages if I ever want to add a second maintainer. PyPI doesn't have any sort of organizations or group accounts, so every time I need to do something with YoWASP on Test PyPI after it gets wiped, I get dangerously close to carpal tunnel syndrome.


mithro commented Nov 25, 2020

I'm guessing what you are really saying is that PyPI doesn't provide a package management API, so you have to do all that manually?

whitequark (Member Author) commented

I'm guessing what you are really saying is that PyPI doesn't provide a package management API, so you have to do all that manually?

Yes, although there is also the problem that it greatly multiplies the number of possible package conflicts. (In particular, this scheme cannot be used until the entire Python ecosystem migrates to the new pip dependency resolver.)

sporniket commented

Hello! Since I finally got myself an actual FPGA dev board, a so-called "Colorlight i9" with its extension board, which is not bundled yet, I was trying to add yet another one (there are some PRs for it, though).

For this topic, I think the "amaranth-boards" repository is a mix of two things:

  • a specific API: resources and extensions (which for now look like more resources; I have yet to see why they are separate)
  • the board definitions

I would separate them. The API part would either be an independent library, or preferably be reintegrated into amaranth to leverage its release process.

The "amaranth-boards" package would then be only a bunch of board definitions, added and updated/fixed on demand (via PRs). Since it would not be a set of features, but rather a collection of COTS hardware descriptions, I would drop any numbering scheme and use a timestamp (like '2022.09.17').
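A date-based version of the kind suggested could be derived mechanically at release time; a trivial sketch (not an existing amaranth-boards scheme):

```python
# Sketch: turn a release date into a CalVer-style version string such as
# "2022.09.17", matching the timestamp scheme suggested above.
import datetime

def calver(date: datetime.date) -> str:
    return date.strftime("%Y.%m.%d")
```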

That leaves the "test" (actually just 'blinky'). I see it more as demo code that helps verify that the targeted resource is correctly implemented. Thus I would extend this part so that each kind of resource has typical demo code (some are easy to devise, like LEDs and GPIO; others would be more difficult, and except for the onboard LEDs, any other test would require a test rig that the user would have to build and connect to the board). Then, for each board, the main execution would generate each relevant demo gateware, to be uploaded by the user for verification.

whitequark (Member Author) commented

I would separate them. The API part would either be an independent library, or preferably be reintegrated into amaranth to leverage its release process.

Right now these are often updated in lockstep with the boards, which I think is important. This is essentially the "monorepo" approach to updating dependencies. It arguably doesn't scale.

That leaves the "test" (actually just 'blinky'). I see it more as demo code that helps verify that the targeted resource is correctly implemented. Thus I would extend this part so that each kind of resource has typical demo code (some are easy to devise, like LEDs and GPIO; others would be more difficult, and except for the onboard LEDs, any other test would require a test rig that the user would have to build and connect to the board). Then, for each board, the main execution would generate each relevant demo gateware, to be uploaded by the user for verification.

It would be very nice to have something like that available. Unfortunately I'm not sure if we'll ever find people who can spend the effort of maintaining such test code for a very wide array of boards.

sporniket commented

I plan to do something for testing my board's GPIOs: a blinky parametrized with the connector name / pin name. When I have something, I will post a PR of this demo for sure.
