Writing a digital logic simulator - Part 10
Parcel integration
Introduction
Links to previous posts: Part 1 Part 2 Part 3 Part 4 Part 5 Part 6 Part 7 Part 8 Part 9
I think it’s time to automate the part links…
Until now I have been bundling all the JavaScript dependencies manually, but there is an increasing number of interesting libraries that would improve this project, so migrating to npm and using a bundler seems like a worthwhile step.
Parcel
Parcel is a bundler. What is a bundler? You may think that just by using JavaScript your code will happily run on the web, but that’s not the case. Following the invention of Node, server-side JavaScript code has become a thing, with its own module system (require, node_modules). A bundler basically resolves all those Node-specific imports and replaces them with hacks so the code works in most browsers.
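For example, this kind of “bare” import (taken from the project’s own index.js) only works because the bundler resolves it from node_modules at build time; a browser on its own has no idea where to find the package. The terminal element id below is made up, just for illustration:
// "Bare" module specifier: the bundler resolves it from node_modules,
// a browser on its own cannot.
import { Terminal } from 'xterm';

// Hypothetical element id, only for illustration.
var term = new Terminal();
term.open(document.getElementById('terminal'));
term.writeln('hello from the bundle');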
Why parcel? The main and only reason is that parcel has Rust support and there is a stdweb plugin for it.
This means that now building the demo will be as simple as
parcel static/index.html
At least in theory, and after setting everything up. The Parcel slogan is:
Blazing fast, zero configuration web application bundler
So the main selling points are speed and zero-configuration.
I disagree on speed because my previous strategy, using <script src=path>
for importing modules, was faster than any bundler, but it obviously had too
many downsides.
And zero-conf sounds cool: parcel “just works”, but what happens when it
doesn’t? You delete the cache and try again, and it magically works.
So ideally parcel should work out of the box, but since it is still a work in
progress, it can have some issues. There are some flags documented when running parcel help build.
For me the most important ones were:
--log-level <level> set the log level, either "0" (no output), "1" (errors), "2" (warnings), "3" (info), "4" (verbose) or "5" (debug, creates a log file).
--detailed-report print a detailed build report after a completed build
--public-url <url> set the public URL to serve on. defaults to "/"
Basically more logs, very helpful if Travis fails, and an option to set the public URL. This is another problem that I didn’t have before: for some reason parcel doesn’t like relative URLs. So once you deploy the code, you can’t move it somewhere else. For example, I like to copy the working demos from the nightly folder into a vXX folder, but I can’t do that because the code was not deployed there. Again, in theory, because in practice I use dirty hacks to search and replace the deploy URL in all the files:
# Safely copy a nightly deploy to a vXX folder
TAG_VERSION="v10"
NIGHTLY_URL="/comphdl/demo/nightly"
TAG_URL="/comphdl/demo/$TAG_VERSION"
cp -rf demo/nightly demo/$TAG_VERSION
find demo/$TAG_VERSION -type f -print0 | xargs -0 sed -i "s,$NIGHTLY_URL,$TAG_URL,g"
More problems:
No support for bundling a folder. I have to manually import all the comphdl examples and svg files.
See commit.
Well, the comphdl examples are manually copied using a cp command because I didn’t want to bother.
Parcel tries to remove global variables, which is a good thing, but it breaks my setup where I can do whatever I want from Rust code using the js! macro. But don’t worry, we just need to import the desired modules into the Rust code… more on that later.
Another thing that broke is the dirty hack of using a style tag to set the signal colors. Parcel thought that an empty style tag was dead code and removed it.
See commit.
But eventually I took the hint and removed that hack, replacing the style tag with a querySelectorAll and a for loop.
See commit.
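Roughly, the new version looks something like this; the .signal-wire selector and the signalIsHigh() helper are made-up names, only for illustration:
// Instead of patching a <style> tag, select the affected elements directly
// and set the color one by one (selector and helper are hypothetical).
var wires = document.querySelectorAll(".signal-wire");
for (var i = 0; i < wires.length; i++) {
    wires[i].style.stroke = signalIsHigh(i) ? "#00FF00" : "#004400";
}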
So somehow using Parcel forces me to write half-decent JavaScript code. And most importantly, now everything is configured and should “just work”, so we can add all the npm packages we want without worrying about anything.
npm
I did not enjoy working with npm. Maybe it’s because I’m used to cargo?
Well, the most important command is npm install, which has two meanings:
- npm install: Download and install the dependencies required by this project.
- npm install foo: Download and install package foo from the npm repository.
I even ran into an issue when importing projects from a folder:
https://github.com/npm/npm/issues/13528#issuecomment-396522166
And another issue: when running npm install, npm will replace https with http in package-lock.json (using http for downloading dependencies? really?):
https://npm.community/t/some-packages-have-dist-tarball-as-http-and-not-https
And the accepted solution is literally to purge the cache and reinstall:
rm -rf ./node_modules
npm cache clean --force
npm install --prefer-online
And for some reason npm install doesn’t work on Travis, and I have to run it locally and commit package-lock.json. Therefore, when I want to update the dependencies I just run:
rm package-lock.json
npm install
git commit -am 'Update dependencies'
Although there is one nice thing about npm: GitHub automatically notified me about a vulnerable dependency, so I was able to upgrade it before… before what? I’m not running it server-side, so the worst that can happen is an XSS vulnerability, right?
JavaScript
The process of removing the bundled dependencies and replacing them with a simple “import” was somewhat rewarding, but figuring out the import syntax was a challenge.
Take a look at the first lines of index.js:
'use strict';
import comphdl from "../comphdl_web/Cargo.toml";
import * as ace from 'brace';
import 'brace/theme/tomorrow';
import 'brace/mode/rust';
import { Terminal } from 'xterm';
import * as WaveDromFastUpdate from './wavedrom_fast_update.js'
//import { WaveDrom } from 'wavedrom';
window.WaveSkin = require('wavedrom/skins/narrow.js');
var WaveDrom = require('wavedrom');
var Stats = require('stats.js');
What’s the difference between:
import name from path;
import * as name from path;
import { Name } from path;
import path;
var Name = require(path);
(This sounds like an interview question.) Well, I don’t know, but my favorites are import * as name from path; and var Name = require(path); the others don’t work most of the time.
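For reference, this is roughly what each form means; which one actually works depends mostly on whether the package was written as an ES module or as a CommonJS module ("pkg" is a placeholder name):
// Rough meaning of each import form:
import name from "pkg";        // the default export of "pkg"
import * as name from "pkg";   // a namespace object with all the named exports
import { Name } from "pkg";    // a single named export
import "pkg";                  // run the module only for its side effects
var Name = require("pkg");     // CommonJS: whatever the module assigned to module.exports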
One potentially amazing feature is dynamic imports: get a faster initial load by lazily loading the optional features. This would be really helpful if we manage to split the Wasm file (right now it weighs more than 1 MiB).
But guess what, dynamic imports have yet another syntax: import(path), which returns a promise.
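A minimal sketch of what the lazy loading could look like; the optional_feature.js module, its start() function and the button id are all hypothetical names:
// Only download and run the optional module when the user asks for it.
// "enable-feature", optional_feature.js and start() are made-up names.
document.getElementById("enable-feature").addEventListener("click", function() {
    import("./optional_feature.js").then(function(feature) {
        feature.start();
    });
});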
I was forced to use dynamic import in the Rust code: because Parcel disables globals, I need to import index.js in order to use the term2 variable and print to xtermjs. This is what it looks like:
js!{
import("/index.js").then(function(p) {
p.term2.writeln(@{message});
});
}
A slightly better alternative would be:
js!{
const p = await import("/index.js");
p.term2.writeln(@{message});
}
But guess what:
Compilation failed!
error[E0721]: `await` is a keyword in the 2018 edition
--> comphdl_web/src/stdweb_logger.rs:90:23
|
90 | const p = await import("/index.js");
| ^^^^^ help: you can use a raw identifier to stay compatible: `r#await`
Well, let’s follow the suggestion and replace await with r#await…
Unexpected character '#' (586:108)
584 | },
585 | "__cargo_web_snippet_f4c9520f1825c5ec4160ff3ce53d2c37e38abadf": function($0, $1) {
> 586 | $1 = Module.STDWEB_PRIVATE.to_js($1);Module.STDWEB_PRIVATE.from_js($0, (function(){const p=r#await import("/index.js");p.term2.writeln(($1));})());
| ^
587 | },
Ok, I give up.
Let’s rant a bit about JavaScript. My favorite features are equality checking and global variables:
function foo() {
    for(i=0; i<10; i++) {} // no var/let: this "i" is the global one
}
var a = [1, 2];
var i = [1, 2];
console.log(a == i); // false: == compares the array references, not the contents
console.log(i); // [1, 2]
foo();
console.log(i); // 10: foo() just overwrote the global i
Rust 2018
Alright, enough JavaScript, let’s talk about Rust.
We begin 2019 with the 2018 edition!
The migration was pretty straightforward: run cargo fix --edition, add edition = "2018" to each Cargo.toml, and optionally run cargo fix --edition-idioms.
See commits.
I’m mostly excited about the new module system: no more extern crate!
We can now compile to wasm32-unknown-unknown
on stable Rust, which means no
nightly needed for stdweb anymore!
core, cli, web
I decided to split the Rust code into three crates:
- comphdl-core contains the actual library: parser and simulator
- comphdl-cli contains the binary, the main function
- comphdl-web contains the Rust code with generated bindings to be easily usable from JS
In the future I would like to make a separate repo for the JavaScript code, as that should allow easier CI and testing.
Deploy
I quickly hacked a script to deploy the parcel builds to GitHub Pages, while keeping the old versions of the demo.
However, I did not take into account that git will keep all the commit history, so if the demo is ~10 MiB, then for every git push to master the repo size will grow by about 10 MiB (unless the files are exactly the same).
Therefore I changed it so it rewrites the history of the gh-pages branch, meaning that only one nightly build will be stored:
NODE_VERSION="10"
ci/create_all_branches.sh &&
ci/install_cargo_web.sh &&
source ~/.nvm/nvm.sh &&
nvm install $NODE_VERSION &&
npm install -g parcel-bundler &&
npm install &&
ci/parcel.sh &&
git checkout gh-pages &&
git add demo/nightly &&
git commit --amend -qm 'Nightly demo' &&
# Force push
git push -fq https://${GH_TOKEN}@github.com/${TRAVIS_REPO_SLUG}.git gh-pages
Changes to the demo
comphdl.svg skin
It turns out that netlistsvg supports custom skins, meaning that we can specify how to render every component:
For now the only differences between the “comphdl” skin and the original one are the customized icons for Not and Buf gates, but this opens the door to custom input buttons, custom components (LEDs, 7-segment displays), etc.
WaveDrom
Remember when we used GTKWave to explore the signals? It would be nice to have a similar tool on the web demo. I found WaveDrom, which does exactly that: it parses a simple JSON which contains information about the waves. It is designed for static data, but I was able to hack it and implement animations. Therefore we can see a live wave diagram of the inputs and outputs of the top component! It is disabled by default because it consumes a non-negligible amount of resources, but it’s great as a starting point.
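For reference, the input is a WaveJSON object that looks roughly like this (the format from the WaveDrom tutorial; the signal names here are made up):
// WaveJSON: one entry per signal, one character of "wave" per clock cycle.
var waves = {
    signal: [
        { name: "clk", wave: "p......." },  // p: clock pulses
        { name: "a",   wave: "01.0.1.." },  // 0/1: low/high, ".": keep the previous value
        { name: "q",   wave: "0.1.0.1." }
    ]
};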
In the future I may implement a custom wave viewer optimized for my use case, but considering the pace of this project it is much more realistic to use an external library.
Let’s see one of the simplest logic gates:
Conclusion
As always, the code is available on GitHub.
If you are looking for a link to the demo, here it is:
https://badel2.github.io/comphdl/demo/v10
In order to build the web demo locally, install the required dependencies and run ./ci/parcel.sh.