mirror of
https://github.com/ankitects/anki.git
synced 2026-01-19 17:29:01 -05:00
* Fix .no-reduce-motion missing from graphs spinner, and not being honored
* Begin migration from protobuf.js -> protobuf-es
Motivation:
- Protobuf-es has a nicer API: messages are represented as classes, and
fields which should exist are not marked as nullable.
- As it uses modules, only the proto messages we actually use get included
in our bundle output. Protobuf.js put everything in a namespace, which
prevented tree-shaking, and made it awkward to access inner messages.
- ./run after touching a proto file drops from about 8s to 6s on my machine. The tradeoff
is slower decoding/encoding (#2043), but that was mainly a concern for the
graphs page, and was unblocked by
778e02415b
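The difference in generated-code shape can be sketched as follows. These `IDeckJs`/`DeckEs` types are hypothetical stand-ins for illustration, not Anki's real generated code:

```typescript
// protobuf.js style: every field is optional and nullable, and messages
// live in a deep namespace object, which defeats tree-shaking.
interface IDeckJs {
  name?: string | null;
  newLimit?: number | null;
}

// protobuf-es style: a plain ES module class whose fields are
// definitely present, initialised to their proto3 defaults.
class DeckEs {
  name = "";
  newLimit = 0;
  constructor(data?: Partial<DeckEs>) {
    Object.assign(this, data);
  }
}

const deck = new DeckEs({ name: "Default" });
// No null checks needed before use:
const label = `${deck.name}: ${deck.newLimit}`;
```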
Approach/notes:
- We generate the new protobuf-es interface in addition to existing
protobuf.js interface, so we can migrate a module at a time, starting
with the graphs module.
- rslib:proto now generates RPC methods for TS in addition to the Python
interface. The input-arg-unrolling behaviour of the Python generation is
not required here, as we declare the input arg as a PlainMessage<T>, which
marks it as requiring all fields to be provided.
- i64 is represented as bigint in protobuf-es. We were using a patch to
protobuf.js to get it to output Javascript numbers instead of long.js
types, but now that our supported browser versions support bigint, it's
probably worth biting the bullet and migrating to bigint use. Our IDs
fit comfortably within MAX_SAFE_INTEGER, but that may not hold for future
fields we add.
- Oneofs are handled differently in protobuf-es, and are going to need
some refactoring.
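The precision issue motivating bigint can be demonstrated in plain JS/TS: an i64 value just past `Number.MAX_SAFE_INTEGER` silently loses precision as a `number`, but not as a `bigint` (the literal below is arbitrary, chosen as 2^53 + 1):

```typescript
// 2^53 + 1 cannot be represented exactly as an IEEE 754 double.
const asNumber = Number("9007199254740993");
const asBigint = BigInt("9007199254740993");

console.log(asNumber === 9007199254740992); // true: rounded down to 2^53
console.log(asBigint === 9007199254740993n); // true: exact
```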
Other notable changes:
- Added a --mkdir arg to our build runner, so we can create a dir easily
during the build on Windows.
- Simplified the preference handling code, by wrapping the preferences
in an outer store, instead of a separate store for each individual
preference. This means a change to one preference will trigger a redraw
of all components that depend on the preference store, but the redrawing
is cheap after moving the data processing to Rust, and it makes the code
easier to follow.
- Drop async(Reactive).ts in favour of more explicit handling with await
blocks/updating.
- Renamed add_inputs_to_group() -> add_dependency(), and fixed it not adding
dependencies to parent groups. Renamed add() -> add_action() for clarity.
* Remove a couple of unused proto imports
* Migrate card info
* Migrate congrats, image occlusion, and tag editor
+ Fix imports for multi-word proto files.
* Migrate change-notetype
* Migrate deck options
* Bump target to es2020; simplify ts lib list
Have used caniuse.com to confirm that Chromium 77, iOS 14.5, and Chrome
on Android support the full es2017-es2020 feature set.
Have used caniuse.com to confirm that Chromium 77, iOS 14.5, and Chrome
on Android support the full es2017-es2020 feature set.
* Migrate import-csv
* Migrate i18n and fix missing output types in .js
* Migrate custom scheduling, and remove protobuf.js
To mostly maintain our old API contract, we make use of protobuf-es's
ability to convert to JSON, which follows the same format as protobuf.js
did. It doesn't cover all cases: users who were previously changing the
variant of a type will need to update their code, as assigning to a new
variant no longer automatically removes the old one, which will cause an
error when we try to convert back from JSON. But I suspect the large majority
of users are adjusting the current variant rather than creating a new one,
and this saves us having to write proxy wrappers, so it seems like a
reasonable compromise.
One other change I made at the same time was to rename value->kind for
the oneofs in our custom study protos, as 'value' was easily confused
with the 'case/value' output that protobuf-es has.
With protobuf.js codegen removed, touching a proto file and invoking
./run drops from about 8s to 6s.
This closes #2043.
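The oneof caveat can be sketched with the discriminated-union shape protobuf-es uses; the type and case names below are made up for illustration, not taken from Anki's protos:

```typescript
// protobuf-es models a oneof as a single {case, value} pair, so setting
// a new variant necessarily replaces the old one.
type StudyKind =
  | { case: "newLimitDelta"; value: number }
  | { case: "cram"; value: { cardLimit: number } };

function setCram(_old: StudyKind, cardLimit: number): StudyKind {
  // Under protobuf.js, assigning a second oneof field could leave both
  // set on the plain object; here the whole pair is swapped atomically.
  return { case: "cram", value: { cardLimit } };
}

const kind = setCram({ case: "newLimitDelta", value: 1 }, 20);
```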
* Allow tree-shaking on protobuf types
* Display backend error messages in our ts alert()
* Make sourcemap generation opt-in for ts-run
Considerably slows down build, and not used most of the time.
450 lines
14 KiB
Rust
// Copyright: Ankitects Pty Ltd and contributors
// License: GNU AGPL, version 3 or later; http://www.gnu.org/licenses/agpl.html

use std::borrow::Cow;
use std::collections::HashMap;

use itertools::Itertools;

use super::*;
use crate::action::BuildAction;
use crate::archives::download_and_extract;
use crate::archives::OnlineArchive;
use crate::archives::Platform;
use crate::hash::simple_hash;
use crate::input::space_separated;
use crate::input::BuildInput;

pub fn node_archive(platform: Platform) -> OnlineArchive {
    match platform {
        Platform::LinuxX64 => OnlineArchive {
            url: "https://nodejs.org/dist/v18.12.1/node-v18.12.1-linux-x64.tar.xz",
            sha256: "4481a34bf32ddb9a9ff9540338539401320e8c3628af39929b4211ea3552a19e",
        },
        Platform::LinuxArm => OnlineArchive {
            url: "https://nodejs.org/dist/v18.12.1/node-v18.12.1-linux-arm64.tar.xz",
            sha256: "3904869935b7ecc51130b4b86486d2356539a174d11c9181180cab649f32cd2a",
        },
        Platform::MacX64 => OnlineArchive {
            url: "https://nodejs.org/dist/v18.12.1/node-v18.12.1-darwin-x64.tar.xz",
            sha256: "6c88d462550a024661e74e9377371d7e023321a652eafb3d14d58a866e6ac002",
        },
        Platform::MacArm => OnlineArchive {
            url: "https://nodejs.org/dist/v18.12.1/node-v18.12.1-darwin-arm64.tar.xz",
            sha256: "17f2e25d207d36d6b0964845062160d9ed16207c08d09af33b9a2fd046c5896f",
        },
        Platform::WindowsX64 => OnlineArchive {
            url: "https://nodejs.org/dist/v18.12.1/node-v18.12.1-win-x64.zip",
            sha256: "5478a5a2dce2803ae22327a9f8ae8494c1dec4a4beca5bbf897027380aecf4c7",
        },
    }
}

pub struct YarnSetup {}

impl BuildAction for YarnSetup {
    fn command(&self) -> &str {
        if cfg!(windows) {
            "corepack.cmd enable yarn"
        } else {
            "corepack enable yarn"
        }
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("", inputs![":extract:node"]);
        build.add_outputs_ext(
            "bin",
            vec![if cfg!(windows) {
                "extracted/node/yarn.cmd"
            } else {
                "extracted/node/bin/yarn"
            }],
            true,
        );
    }

    fn check_output_timestamps(&self) -> bool {
        true
    }
}

pub struct YarnInstall<'a> {
    pub package_json_and_lock: BuildInput,
    pub exports: HashMap<&'a str, Vec<Cow<'a, str>>>,
}

impl BuildAction for YarnInstall<'_> {
    fn command(&self) -> &str {
        "$runner yarn $yarn $out"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("", &self.package_json_and_lock);
        build.add_inputs("yarn", inputs![":yarn:bin"]);
        build.add_outputs("out", vec!["node_modules/.marker"]);
        for (key, value) in &self.exports {
            let outputs: Vec<_> = value.iter().map(|o| format!("node_modules/{o}")).collect();
            build.add_outputs_ext(*key, outputs, true);
        }
    }

    fn check_output_timestamps(&self) -> bool {
        true
    }
}

fn with_cmd_ext(bin: &str) -> Cow<str> {
    if cfg!(windows) {
        format!("{bin}.cmd").into()
    } else {
        bin.into()
    }
}

pub fn setup_node(
    build: &mut Build,
    archive: OnlineArchive,
    binary_exports: &[&'static str],
    mut data_exports: HashMap<&str, Vec<Cow<str>>>,
) -> Result<()> {
    let node_binary = match std::env::var("NODE_BINARY") {
        Ok(path) => {
            assert!(
                Utf8Path::new(&path).is_absolute(),
                "NODE_BINARY must be absolute"
            );
            path.into()
        }
        Err(_) => {
            download_and_extract(
                build,
                "node",
                archive,
                hashmap! {
                    "bin" => vec![if cfg!(windows) { "node.exe" } else { "bin/node" }],
                    "npm" => vec![if cfg!(windows) { "npm.cmd" } else { "bin/npm" }]
                },
            )?;
            inputs![":extract:node:bin"]
        }
    };
    let node_binary = build.expand_inputs(node_binary);
    build.variable("node_binary", &node_binary[0]);

    match std::env::var("YARN_BINARY") {
        Ok(path) => {
            assert!(
                Utf8Path::new(&path).is_absolute(),
                "YARN_BINARY must be absolute"
            );
            build.add_dependency("yarn:bin", inputs![path]);
        }
        Err(_) => {
            build.add_action("yarn", YarnSetup {})?;
        }
    };

    for binary in binary_exports {
        data_exports.insert(
            *binary,
            vec![format!(".bin/{}", with_cmd_ext(binary)).into()],
        );
    }
    build.add_action(
        "node_modules",
        YarnInstall {
            package_json_and_lock: inputs!["yarn.lock", "package.json"],
            exports: data_exports,
        },
    )?;
    Ok(())
}

pub struct EsbuildScript<'a> {
    pub script: BuildInput,
    pub entrypoint: BuildInput,
    pub deps: BuildInput,
    /// .js will be appended, and any extra extensions
    pub output_stem: &'a str,
    /// eg ["css", "html"]; the dot is added automatically
    pub extra_exts: &'a [&'a str],
}

impl BuildAction for EsbuildScript<'_> {
    fn command(&self) -> &str {
        "$node_bin $script $entrypoint $out"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("node_bin", inputs!["$node_binary"]);
        build.add_inputs("script", &self.script);
        build.add_inputs("entrypoint", &self.entrypoint);
        build.add_inputs("", inputs!["yarn.lock", ":node_modules", &self.deps]);
        build.add_inputs("", inputs!["out/env"]);
        let stem = self.output_stem;
        let mut outs = vec![format!("{stem}.js")];
        outs.extend(self.extra_exts.iter().map(|ext| format!("{stem}.{ext}")));
        build.add_outputs("out", outs);
    }
}

pub struct DPrint {
    pub inputs: BuildInput,
    pub check_only: bool,
}

impl BuildAction for DPrint {
    fn command(&self) -> &str {
        "$dprint $mode"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("dprint", inputs![":node_modules:dprint"]);
        build.add_inputs("", &self.inputs);
        let mode = if self.check_only { "check" } else { "fmt" };
        build.add_variable("mode", mode);
        build.add_output_stamp(format!("tests/dprint.{mode}"));
    }
}

pub struct SvelteCheck {
    pub tsconfig: BuildInput,
    pub inputs: BuildInput,
}

impl BuildAction for SvelteCheck {
    fn command(&self) -> &str {
        "$svelte-check --tsconfig $tsconfig $
        --fail-on-warnings --threshold warning --use-new-transformation $
        --compiler-warnings $compiler_warnings"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("svelte-check", inputs![":node_modules:svelte-check"]);
        build.add_inputs("tsconfig", &self.tsconfig);
        build.add_inputs("", &self.inputs);
        build.add_inputs("", inputs!["yarn.lock"]);
        build.add_variable(
            "compiler_warnings",
            [
                "a11y-click-events-have-key-events",
                "a11y-no-noninteractive-tabindex",
            ]
            .iter()
            .map(|warning| format!("{}$:ignore", warning))
            .collect::<Vec<_>>()
            .join(","),
        );
        let hash = simple_hash(&self.tsconfig);
        build.add_output_stamp(format!("tests/svelte-check.{hash}"));
    }
}

pub struct TypescriptCheck {
    pub tsconfig: BuildInput,
    pub inputs: BuildInput,
}

impl BuildAction for TypescriptCheck {
    fn command(&self) -> &str {
        "$tsc --noEmit -p $tsconfig"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("tsc", inputs![":node_modules:tsc"]);
        build.add_inputs("tsconfig", &self.tsconfig);
        build.add_inputs("", &self.inputs);
        build.add_inputs("", inputs!["yarn.lock"]);
        let hash = simple_hash(&self.tsconfig);
        build.add_output_stamp(format!("tests/typescript.{hash}"));
    }
}

pub struct Eslint<'a> {
    pub folder: &'a str,
    pub inputs: BuildInput,
    pub eslint_rc: BuildInput,
    pub fix: bool,
}

impl BuildAction for Eslint<'_> {
    fn command(&self) -> &str {
        "$eslint --max-warnings=0 -c $eslint_rc $fix $folder"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("eslint", inputs![":node_modules:eslint"]);
        build.add_inputs("eslint_rc", &self.eslint_rc);
        build.add_inputs("in", &self.inputs);
        build.add_inputs("", inputs!["yarn.lock", "ts/tsconfig.json"]);
        build.add_variable("fix", if self.fix { "--fix" } else { "" });
        build.add_variable("folder", self.folder);
        let hash = simple_hash(self.folder);
        let kind = if self.fix { "fix" } else { "check" };
        build.add_output_stamp(format!("tests/eslint.{kind}.{hash}"));
    }
}

pub struct JestTest<'a> {
    pub folder: &'a str,
    pub deps: BuildInput,
    pub jest_rc: BuildInput,
    pub jsdom: bool,
}

impl BuildAction for JestTest<'_> {
    fn command(&self) -> &str {
        "$jest --config $config $env $folder"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("jest", inputs![":node_modules:jest"]);
        build.add_inputs("", &self.deps);
        build.add_inputs("config", &self.jest_rc);
        build.add_variable("env", if self.jsdom { "--env=jsdom" } else { "" });
        build.add_variable("folder", self.folder);
        let hash = simple_hash(self.folder);
        build.add_output_stamp(format!("tests/jest.{hash}"));
    }
}

pub struct SqlFormat {
    pub inputs: BuildInput,
    pub check_only: bool,
}

impl BuildAction for SqlFormat {
    fn command(&self) -> &str {
        "$tsx $sql_format $mode $in"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("tsx", inputs![":node_modules:tsx"]);
        build.add_inputs("sql_format", inputs!["ts/sql_format/sql_format.ts"]);
        build.add_inputs("in", &self.inputs);
        let mode = if self.check_only { "check" } else { "fix" };
        build.add_variable("mode", mode);
        build.add_output_stamp(format!("tests/sql_format.{mode}"));
    }
}

pub struct GenTypescriptProto<'a> {
    pub protos: BuildInput,
    pub include_dirs: &'a [&'a str],
    /// Automatically created.
    pub out_dir: &'a str,
    /// Can be used to adjust the output js/dts files to point to out_dir.
    pub out_path_transform: fn(&str) -> String,
    /// Script to apply modifications to the generated files.
    pub py_transform_script: &'static str,
}

impl BuildAction for GenTypescriptProto<'_> {
    fn command(&self) -> &str {
        "$protoc $includes $in \
            --plugin $gen-es --es_out $out_dir && \
            $pyenv_bin $script $out_dir"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        let proto_files = build.expand_inputs(&self.protos);
        let output_files: Vec<_> = proto_files
            .iter()
            .flat_map(|f| {
                let js_path = f.replace(".proto", "_pb.js");
                let dts_path = f.replace(".proto", "_pb.d.ts");
                [
                    (self.out_path_transform)(&js_path),
                    (self.out_path_transform)(&dts_path),
                ]
            })
            .collect();

        build.create_dir_all("out_dir", self.out_dir);
        build.add_variable(
            "includes",
            self.include_dirs
                .iter()
                .map(|d| format!("-I {d}"))
                .join(" "),
        );
        build.add_inputs("protoc", inputs![":extract:protoc:bin"]);
        build.add_inputs("gen-es", inputs![":node_modules:protoc-gen-es"]);
        if cfg!(windows) {
            build.add_env_var(
                "PATH",
                &format!("node_modules/.bin;{}", std::env::var("PATH").unwrap()),
            );
        }
        build.add_inputs_vec("in", proto_files);
        build.add_inputs("", inputs!["yarn.lock"]);
        build.add_inputs("pyenv_bin", inputs![":pyenv:bin"]);
        build.add_inputs("script", inputs![self.py_transform_script]);

        build.add_outputs("", output_files);
    }
}

pub struct CompileSass<'a> {
    pub input: BuildInput,
    pub output: &'a str,
    pub deps: BuildInput,
    pub load_paths: Vec<&'a str>,
}

impl BuildAction for CompileSass<'_> {
    fn command(&self) -> &str {
        "$sass -s compressed $args $in -- $out"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("sass", inputs![":node_modules:sass"]);
        build.add_inputs("in", &self.input);
        build.add_inputs("", &self.deps);

        let args = space_separated(self.load_paths.iter().map(|path| format!("-I {path}")));
        build.add_variable("args", args);

        build.add_outputs("out", vec![self.output]);
    }
}

/// Usually we rely on esbuild to transpile our .ts files on the fly, but when
/// we want generated code to be able to import a .ts file, we need to use
/// typescript to generate .js/.d.ts files, or types can't be looked up, and
/// esbuild can't find the file to bundle.
pub struct CompileTypescript<'a> {
    pub ts_files: BuildInput,
    /// Automatically created.
    pub out_dir: &'a str,
    /// Can be used to adjust the output js/dts files to point to out_dir.
    pub out_path_transform: fn(&str) -> String,
}

impl BuildAction for CompileTypescript<'_> {
    fn command(&self) -> &str {
        "$tsc $in --outDir $out_dir -d --skipLibCheck"
    }

    fn files(&mut self, build: &mut impl build::FilesHandle) {
        build.add_inputs("tsc", inputs![":node_modules:tsc"]);
        build.add_inputs("in", &self.ts_files);
        build.add_inputs("", inputs!["yarn.lock"]);

        let ts_files = build.expand_inputs(&self.ts_files);
        let output_files: Vec<_> = ts_files
            .iter()
            .flat_map(|f| {
                let js_path = f.replace(".ts", ".js");
                let dts_path = f.replace(".ts", ".d.ts");
                [
                    (self.out_path_transform)(&js_path),
                    (self.out_path_transform)(&dts_path),
                ]
            })
            .collect();

        build.create_dir_all("out_dir", self.out_dir);
        build.add_outputs("", output_files);
    }
}