Compare commits


11 commits

Author SHA1 Message Date
Alex Good
8af437d335
fix the e2e 2022-09-26 17:51:05 -05:00
Orion Henry
85f5016193
fmt & clippy 2022-09-26 17:46:54 -05:00
Orion Henry
1b26647f66
update all tests 2022-09-26 17:46:53 -05:00
Orion Henry
bab4501671
implement increment 2022-09-26 17:46:53 -05:00
Orion Henry
6142c383b1
map and array insert, delete for apply() 2022-09-26 17:46:53 -05:00
Orion Henry
6cbf2737c0
move op observer into transaction 2022-09-26 17:46:50 -05:00
Alex Good
b946e8c1a6
Set optimization levels to 'Z' for release profile
This reduces the size of the WASM bundle which is generated to around
800kb. Unfortunately wasm-pack doesn't allow us to use arbitrary
profiles when building and the optimization level has to be set at the
workspace root - consequently this flag is set for all packages in the
workspace. This shouldn't be an issue really as all our dependents in
the Rust world will be setting their own optimization flags anyway.
2022-09-26 17:40:28 -05:00
Alex Good
8c0bb30530
Add an e2e testing tool for the JS packaging
JS packaging is complicated and testing it manually is irritating. Add a
tool in `automerge-js/e2e` which stands up a local NPM registry and
publishes the various packages to that registry for use in automated and
manual tests. Update the test script in `scripts/ci/js_tests` to run the
tests using this tool
2022-09-26 17:40:28 -05:00
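The core building block this kind of tool relies on is "spawn a subprocess and await its exit, failing on a nonzero exit code". A simplified sketch of that helper (a cut-down version of the `spawnAndWait` function the tool defines; the `scripts/ci/js_tests` wiring is omitted):

```typescript
import { spawn } from "child_process"
import { once } from "events"

// Spawn a command, inherit stdio, and reject if it exits nonzero.
async function spawnAndWait(cmd: string, args: string[]): Promise<void> {
  const child = spawn(cmd, args, { stdio: "inherit" })
  const [code] = await once(child, "exit")
  if (code !== 0) {
    throw new Error(`${cmd} exited with code ${code}`)
  }
}
```

Each build and publish step in the tool is a sequence of such awaited spawns, so a failure anywhere aborts the whole run.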
Alex Good
ce250a7bcd
Add examples of using automerge with bundlers 2022-09-26 17:40:27 -05:00
Alex Good
7a6a33da98
Rename automerge-js to automerge
Now that automerge-js is ready to go we rename it to `automerge` and
set the version to `1.0.1-preview.8`, which is one preview after the
currently published preview of the `automerge` package.
2022-09-26 17:40:27 -05:00
Alex Good
748dc954be
Remove async automerge-wasm wrapper
By moving to wasm-bindgens `bundler` target rather than using the `web`
target we remove the need for an async initialization step on the
automerge-wasm package. This means that the automerge-js package can now
depend directly on automerge-wasm and perform initialization itself,
thus making automerge-js a drop in replacement for the `automerge` JS
package (hopefully).

We bump the versions of automerge-wasm and automerge-types.
2022-09-26 17:40:22 -05:00
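The consumer-facing difference described above can be sketched with placeholder types (this is not the real automerge API surface, just the shape of the two initialization styles):

```typescript
// Placeholder API type standing in for the wasm-backed module.
type Api = { init(): object }

// `web` target style: the wasm must be fetched and compiled at runtime,
// so consumers go through an async load step before using the API.
async function webTargetConsumer(load: () => Promise<Api>): Promise<object> {
  const Automerge = await load()
  return Automerge.init()
}

// `bundler` target style: the bundler links the wasm at build time, so
// the wrapper initializes itself and is usable synchronously on import.
function bundlerTargetConsumer(Automerge: Api): object {
  return Automerge.init()
}
```

The second style is what lets the JS wrapper present the same synchronous API as the original `automerge` package.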
63 changed files with 4164 additions and 780 deletions


@@ -11,7 +11,11 @@ resolver = "2"
 [profile.release]
 debug = true
 lto = true
-opt-level = 3
+opt-level = 'z'

 [profile.bench]
 debug = true
+
+[profile.release.package.automerge-wasm]
+debug = false
+opt-level = 'z'


@@ -170,7 +170,7 @@ pub unsafe extern "C" fn AMcommit(
     if let Some(time) = time.as_ref() {
         options.set_time(*time);
     }
-    to_result(doc.commit_with::<()>(options))
+    to_result(doc.commit_with(options))
 }

 /// \memberof AMdoc

automerge-js/e2e/.gitignore (vendored, new file, +3)

@@ -0,0 +1,3 @@
node_modules/
verdacciodb/
htpasswd


@@ -0,0 +1,71 @@
# End to end testing for JavaScript packaging
The network of packages and bundlers we rely on to get the `automerge` package
working is a little complex. We have the `automerge-wasm` package, which the
`automerge` package depends upon, which means that anyone who depends on
`automerge` needs to either a) be using node or b) use a bundler in order to
load the underlying WASM module which is packaged in `automerge-wasm`.
The various bundlers involved are complicated and capricious and so we need an
easy way of testing that everything is in fact working as expected. To do this
we run a custom NPM registry (namely [Verdaccio](https://verdaccio.org/)) and
build the `automerge-wasm` and `automerge` packages and publish them to this
registry. Once we have this registry running we are able to build the example
projects which depend on these packages and check that everything works as
expected.
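At its core the harness has to run each build/publish action while the registry process stays alive, and fail if the registry dies first. That control flow can be sketched with plain promises (names here are illustrative; the real tool races against a verdaccio subprocess):

```typescript
type Step = "server-died" | "action-completed"

// Race an action against a "server exited" signal; whichever settles
// first decides the outcome.
async function raceAgainstServer(
  serverExited: Promise<void>,
  action: () => Promise<void>,
): Promise<Step> {
  const died = serverExited.then((): Step => "server-died")
  const done = action().then((): Step => "action-completed")
  return Promise.race([died, done])
}
```

If the result is `"server-died"` the harness can tear everything down and surface an error instead of hanging.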
## Usage
First, install everything:
```
yarn install
```
### Build `automerge-js`
This builds the `automerge-wasm` package and then runs `yarn build` in the
`automerge-js` project with the `--registry` set to the verdaccio registry. The
end result is that you can run `yarn test` in the resulting `automerge-js`
directory in order to run tests against the current `automerge-wasm`.
```
yarn e2e buildjs
```
### Build examples
This builds either all the examples in `automerge-js/examples` or just a subset
of them. Once the build is complete you can run the relevant scripts (e.g. `vite dev`
for the Vite example) to check everything works.
```
yarn e2e buildexamples
```
Or, to just build the webpack example
```
yarn e2e buildexamples -e webpack
```
### Run Registry
If you're experimenting with a project which is not in the `examples` folder
you'll need a running registry. `run-registry` builds and publishes
`automerge-js` and `automerge-wasm` and then runs the registry at
`localhost:4873`.
```
yarn e2e run-registry
```
You can now run `yarn install --registry http://localhost:4873` to experiment
with the built packages.
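Before publishing anything, the tool has to wait for the registry to actually accept HTTP requests on `localhost:4873`. A minimal version of that poll loop (this sketch uses Node 18's global `fetch` grabbed via `globalThis`; the real tool uses `node-fetch`):

```typescript
import { setTimeout as sleep } from "timers/promises"

// Node 18+ ships a global fetch; grab it loosely typed so this compiles
// without DOM lib types.
const fetchFn: (url: string) => Promise<{ status: number }> =
  (globalThis as any).fetch

// Poll a URL until it returns HTTP 200, retrying on connection errors.
async function waitForRegistry(url: string, delayMs = 100): Promise<void> {
  for (;;) {
    try {
      const resp = await fetchFn(url)
      if (resp.status === 200) return
    } catch {
      // Registry not accepting connections yet; fall through and retry.
    }
    await sleep(delayMs)
  }
}
```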
## Using the `dev` build of `automerge-wasm`
All the commands above take a `-p` flag which can be either `release` or
`dev`. The `dev` profile builds with additional debug symbols, which makes
errors less cryptic.

automerge-js/e2e/index.ts (new file, +409)

@@ -0,0 +1,409 @@
import {once} from "events"
import {setTimeout} from "timers/promises"
import {spawn, ChildProcess} from "child_process"
import * as child_process from "child_process"
import {command, subcommands, run, array, multioption, option, Type} from "cmd-ts"
import * as path from "path"
import * as fsPromises from "fs/promises"
import fetch from "node-fetch"
const VERDACCIO_DB_PATH = path.normalize(`${__dirname}/verdacciodb`)
const VERDACCIO_CONFIG_PATH = path.normalize(`${__dirname}/verdaccio.yaml`)
const AUTOMERGE_WASM_PATH = path.normalize(`${__dirname}/../../automerge-wasm`)
const AUTOMERGE_JS_PATH = path.normalize(`${__dirname}/..`)
const EXAMPLES_DIR = path.normalize(path.join(__dirname, "../", "examples"))
// The different example projects in "../examples"
type Example = "webpack" | "vite"
// Type to parse strings to `Example` so the types line up for the `buildExamples` command
const ReadExample: Type<string, Example> = {
async from(str) {
if (str === "webpack") {
return "webpack"
} else if (str === "vite") {
return "vite"
} else {
throw new Error(`Unknown example type ${str}`)
}
}
}
type Profile = "dev" | "release"
const ReadProfile: Type<string, Profile> = {
async from(str) {
if (str === "dev") {
return "dev"
} else if (str === "release") {
return "release"
} else {
throw new Error(`Unknown profile ${str}`)
}
}
}
const buildjs = command({
name: "buildjs",
args: {
profile: option({
type: ReadProfile,
long: "profile",
short: "p",
defaultValue: () => "dev" as Profile
})
},
handler: ({profile}) => {
console.log("building js")
withPublishedWasm(profile, async (registryUrl: string) => {
await buildAndPublishAutomergeJs(registryUrl)
})
}
})
const buildWasm = command({
name: "buildwasm",
args: {
profile: option({
type: ReadProfile,
long: "profile",
short: "p",
defaultValue: () => "dev" as Profile
})
},
handler: ({profile}) => {
console.log("building automerge-wasm")
withRegistry(
publishAutomergeTypes,
buildAutomergeWasm(profile),
)
}
})
const buildexamples = command({
name: "buildexamples",
args: {
examples: multioption({
long: "example",
short: "e",
type: array(ReadExample),
}),
profile: option({
type: ReadProfile,
long: "profile",
short: "p",
defaultValue: () => "dev" as Profile
})
},
handler: ({examples, profile}) => {
if (examples.length === 0) {
examples = ["webpack", "vite"]
}
buildExamples(examples, profile)
}
})
const runRegistry = command({
name: "run-registry",
args: {
profile: option({
type: ReadProfile,
long: "profile",
short: "p",
defaultValue: () => "dev" as Profile
})
},
handler: ({profile}) => {
withPublishedWasm(profile, async (registryUrl: string) => {
await buildAndPublishAutomergeJs(registryUrl)
console.log("\n************************")
console.log(` Verdaccio NPM registry is running at ${registryUrl}`)
console.log(" press CTRL-C to exit ")
console.log("************************")
await once(process, "SIGINT")
}).catch(e => {
console.error(`Failed: ${e}`)
})
}
})
const app = subcommands({
name: "e2e",
cmds: {buildjs, buildexamples, buildwasm: buildWasm, "run-registry": runRegistry}
})
run(app, process.argv.slice(2))
async function buildExamples(examples: Array<Example>, profile: Profile) {
await withPublishedWasm(profile, async (registryUrl) => {
printHeader("building and publishing automerge")
await buildAndPublishAutomergeJs(registryUrl)
for (const example of examples) {
printHeader(`building ${example} example`)
if (example === "webpack") {
const projectPath = path.join(EXAMPLES_DIR, example)
await removeExistingAutomerge(projectPath)
await fsPromises.rm(path.join(projectPath, "yarn.lock"), {force: true})
await spawnAndWait("yarn", ["--cwd", projectPath, "install", "--registry", registryUrl, "--check-files"], {stdio: "inherit"})
await spawnAndWait("yarn", ["--cwd", projectPath, "build"], {stdio: "inherit"})
} else if (example === "vite") {
const projectPath = path.join(EXAMPLES_DIR, example)
await removeExistingAutomerge(projectPath)
await fsPromises.rm(path.join(projectPath, "yarn.lock"), {force: true})
await spawnAndWait("yarn", ["--cwd", projectPath, "install", "--registry", registryUrl, "--check-files"], {stdio: "inherit"})
await spawnAndWait("yarn", ["--cwd", projectPath, "build"], {stdio: "inherit"})
}
}
})
}
type WithRegistryAction = (registryUrl: string) => Promise<void>
async function withRegistry(action: WithRegistryAction, ...actions: Array<WithRegistryAction>) {
// First, start verdaccio
printHeader("Starting verdaccio NPM server")
const verd = await VerdaccioProcess.start()
actions.unshift(action)
for (const action of actions) {
try {
type Step = "verd-died" | "action-completed"
const verdDied: () => Promise<Step> = async () => {
await verd.died()
return "verd-died"
}
const actionComplete: () => Promise<Step> = async () => {
await action("http://localhost:4873")
return "action-completed"
}
const result = await Promise.race([verdDied(), actionComplete()])
if (result === "verd-died") {
throw new Error("verdaccio unexpectedly exited")
}
} catch(e) {
await verd.kill()
throw e
}
}
await verd.kill()
}
async function withPublishedWasm(profile: Profile, action: WithRegistryAction) {
await withRegistry(
publishAutomergeTypes,
buildAutomergeWasm(profile),
publishAutomergeWasm,
action
)
}
async function publishAutomergeTypes(registryUrl: string) {
// Publish automerge-types
printHeader("Publishing automerge-types package to verdaccio")
await fsPromises.rm(path.join(VERDACCIO_DB_PATH, "automerge-types"), { recursive: true, force: true} )
await yarnPublish(registryUrl, path.join(AUTOMERGE_WASM_PATH, "types"))
}
function buildAutomergeWasm(profile: Profile): WithRegistryAction {
return async (registryUrl: string) => {
printHeader("building automerge-wasm")
await spawnAndWait("yarn", ["--cwd", AUTOMERGE_WASM_PATH, "--registry", registryUrl, "install"], {stdio: "inherit"})
const cmd = profile === "release" ? "release" : "debug"
await spawnAndWait("yarn", ["--cwd", AUTOMERGE_WASM_PATH, cmd], {stdio: "inherit"})
}
}
async function publishAutomergeWasm(registryUrl: string) {
printHeader("Publishing automerge-wasm to verdaccio")
await fsPromises.rm(path.join(VERDACCIO_DB_PATH, "automerge-wasm"), { recursive: true, force: true} )
await yarnPublish(registryUrl, AUTOMERGE_WASM_PATH)
}
async function buildAndPublishAutomergeJs(registryUrl: string) {
// Build the js package
printHeader("Building automerge")
await removeExistingAutomerge(AUTOMERGE_JS_PATH)
await removeFromVerdaccio("automerge")
await fsPromises.rm(path.join(AUTOMERGE_JS_PATH, "yarn.lock"), {force: true})
await spawnAndWait("yarn", ["--cwd", AUTOMERGE_JS_PATH, "install", "--registry", registryUrl, "--check-files"], {stdio: "inherit"})
await spawnAndWait("yarn", ["--cwd", AUTOMERGE_JS_PATH, "build"], {stdio: "inherit"})
await yarnPublish(registryUrl, AUTOMERGE_JS_PATH)
}
/**
* A running verdaccio process
*
*/
class VerdaccioProcess {
child: ChildProcess
stdout: Array<Buffer>
stderr: Array<Buffer>
constructor(child: ChildProcess) {
this.child = child
// Collect stdout/stderr otherwise the subprocess gets blocked writing
this.stdout = []
this.stderr = []
this.child.stdout?.on("data", (data) => this.stdout.push(data))
this.child.stderr?.on("data", (data) => this.stderr.push(data))
const errCallback = (e: any) => {
console.error("!!!!!!!!!ERROR IN VERDACCIO PROCESS!!!!!!!!!")
console.error(" ", e)
if (this.stdout.length > 0) {
console.log("\n**Verdaccio stdout**")
const stdout = Buffer.concat(this.stdout)
process.stdout.write(stdout)
}
if (this.stderr.length > 0) {
console.log("\n**Verdaccio stderr**")
const stderr = Buffer.concat(this.stderr)
process.stdout.write(stderr)
}
process.exit(-1)
}
this.child.on("error", errCallback)
}
/**
* Spawn a verdaccio process and wait for it to respond successfully to http requests
*
* The returned `VerdaccioProcess` can be used to control the subprocess
*/
static async start() {
const child = spawn("yarn", ["verdaccio", "--config", VERDACCIO_CONFIG_PATH], {env: { FORCE_COLOR: "true"}})
// Forward stdout and stderr whilst waiting for startup to complete
const stdoutCallback = (data: Buffer) => process.stdout.write(data)
const stderrCallback = (data: Buffer) => process.stderr.write(data)
child.stdout.on("data", stdoutCallback)
child.stderr.on("data", stderrCallback)
const errored = once(child, "error")
const healthCheck = async () => {
while (true) {
try {
const resp = await fetch("http://localhost:4873")
if (resp.status === 200) {
return
} else {
console.log(`Healthcheck failed: bad status ${resp.status}`)
}
} catch (e) {
console.error(`Healthcheck failed: ${e}`)
}
await setTimeout(500)
}
}
await Promise.race([healthCheck(), errored])
// Stop forwarding stdout/stderr
child.stdout.off("data", stdoutCallback)
child.stderr.off("data", stderrCallback)
return new VerdaccioProcess(child)
}
/**
* Send a SIGTERM to the process and wait for it to stop
*/
async kill() {
this.child.kill();
const errored = once(this.child, "error")
const finished = once(this.child, "close")
await Promise.race([errored, finished])
}
/**
* A promise which resolves when the subprocess exits for any reason
*/
async died(): Promise<number | null> {
const [exit, _signal] = await once(this.child, "exit")
return exit
}
}
function printHeader(header: string) {
console.log("\n===============================")
console.log(` ${header}`)
console.log("===============================")
}
/**
* Removes the automerge, automerge-wasm, and automerge-js packages from
* `$packageDir/node_modules`
*
* This is useful to force a package to be refreshed, in combination with
* `yarn install --check-files`, which checks whether each package is present in
* `node_modules` and forces a reinstall if it is not.
*
* @param packageDir - The directory containing the package.json of the target project
*/
async function removeExistingAutomerge(packageDir: string) {
await fsPromises.rm(path.join(packageDir, "node_modules", "automerge-wasm"), {recursive: true, force: true})
await fsPromises.rm(path.join(packageDir, "node_modules", "automerge-types"), {recursive: true, force: true})
await fsPromises.rm(path.join(packageDir, "node_modules", "automerge"), {recursive: true, force: true})
}
type SpawnResult = {
stdout?: Buffer,
stderr?: Buffer,
}
async function spawnAndWait(cmd: string, args: Array<string>, options: child_process.SpawnOptions): Promise<SpawnResult> {
const child = spawn(cmd, args, options)
let stdout = null
let stderr = null
if (child.stdout) {
stdout = []
child.stdout.on("data", data => stdout.push(data))
}
if (child.stderr) {
stderr = []
child.stderr.on("data", data => stderr.push(data))
}
const [exit, _signal] = await once(child, "exit")
if (exit && exit !== 0) {
throw new Error(`${cmd} exited with nonzero code ${exit}`)
}
return {
stderr: stderr ? Buffer.concat(stderr) : null,
stdout: stdout ? Buffer.concat(stdout) : null
}
}
/**
* Remove a package from the verdaccio registry. This is necessary because we
* often want to _replace_ a version rather than update the version number.
Obviously this is very bad and verboten in normal circumstances, but the
whole point here is to be able to test the entire packaging story, so it's
okay, I Promise.
*/
async function removeFromVerdaccio(packageName: string) {
await fsPromises.rm(path.join(VERDACCIO_DB_PATH, packageName), {force: true, recursive: true})
}
async function yarnPublish(registryUrl: string, cwd: string) {
await spawnAndWait(
"yarn",
[
"--registry",
registryUrl,
"--cwd",
cwd,
"publish",
"--non-interactive",
],
{
stdio: "inherit",
env: {
FORCE_COLOR: "true",
// This is a fake token, it just has to be the right format
npm_config__auth: "//localhost:4873/:_authToken=Gp2Mgxm4faa/7wp0dMSuRA=="
}
})
}


@@ -0,0 +1,23 @@
{
"name": "e2e",
"version": "0.0.1",
"description": "",
"main": "index.js",
"scripts": {
"e2e": "ts-node index.ts"
},
"author": "",
"license": "ISC",
"dependencies": {
"@types/node": "^18.7.18",
"cmd-ts": "^0.11.0",
"node-fetch": "^2",
"ts-node": "^10.9.1",
"typed-emitter": "^2.1.0",
"typescript": "^4.8.3",
"verdaccio": "5"
},
"devDependencies": {
"@types/node-fetch": "2.x"
}
}


@@ -0,0 +1,6 @@
{
"compilerOptions": {
"types": ["node"],
"module": "nodenext"
}
}


@@ -0,0 +1,28 @@
storage: "./verdacciodb"
auth:
htpasswd:
file: ./htpasswd
publish:
allow_offline: true
logs: {type: stdout, format: pretty, level: info}
packages:
"automerge-types":
access: "$all"
publish: "$all"
"automerge-wasm":
access: "$all"
publish: "$all"
"automerge-js":
access: "$all"
publish: "$all"
"*":
access: "$all"
publish: "$all"
proxy: npmjs
"@*/*":
access: "$all"
publish: "$all"
proxy: npmjs
uplinks:
npmjs:
url: https://registry.npmjs.org/

automerge-js/e2e/yarn.lock (new file, +2130)

File diff suppressed because it is too large.

automerge-js/examples/vite/.gitignore (vendored, new file, +2)

@@ -0,0 +1,2 @@
node_modules/
yarn.lock


@@ -0,0 +1,13 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/vite.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Vite + TS</title>
</head>
<body>
<div id="app"></div>
<script type="module" src="/src/main.ts"></script>
</body>
</html>


@@ -0,0 +1,15 @@
import * as Automerge from "/node_modules/.vite/deps/automerge-js.js?v=6e973f28";
console.log(Automerge);
let doc = Automerge.init();
doc = Automerge.change(doc, (d) => d.hello = "from automerge-js");
console.log(doc);
const result = JSON.stringify(doc);
if (typeof document !== "undefined") {
const element = document.createElement("div");
element.innerHTML = JSON.stringify(result);
document.body.appendChild(element);
} else {
console.log("node:", result);
}
//# sourceMappingURL=data:application/json;base64,eyJ2ZXJzaW9uIjozLCJzb3VyY2VzIjpbIi9ob21lL2FsZXgvUHJvamVjdHMvYXV0b21lcmdlL2F1dG9tZXJnZS1ycy9hdXRvbWVyZ2UtanMvZXhhbXBsZXMvdml0ZS9zcmMvbWFpbi50cyJdLCJzb3VyY2VzQ29udGVudCI6WyJpbXBvcnQgKiBhcyBBdXRvbWVyZ2UgZnJvbSBcImF1dG9tZXJnZS1qc1wiXG5cbi8vIGhlbGxvIHdvcmxkIGNvZGUgdGhhdCB3aWxsIHJ1biBjb3JyZWN0bHkgb24gd2ViIG9yIG5vZGVcblxuY29uc29sZS5sb2coQXV0b21lcmdlKVxubGV0IGRvYyA9IEF1dG9tZXJnZS5pbml0KClcbmRvYyA9IEF1dG9tZXJnZS5jaGFuZ2UoZG9jLCAoZDogYW55KSA9PiBkLmhlbGxvID0gXCJmcm9tIGF1dG9tZXJnZS1qc1wiKVxuY29uc29sZS5sb2coZG9jKVxuY29uc3QgcmVzdWx0ID0gSlNPTi5zdHJpbmdpZnkoZG9jKVxuXG5pZiAodHlwZW9mIGRvY3VtZW50ICE9PSAndW5kZWZpbmVkJykge1xuICAgIC8vIGJyb3dzZXJcbiAgICBjb25zdCBlbGVtZW50ID0gZG9jdW1lbnQuY3JlYXRlRWxlbWVudCgnZGl2Jyk7XG4gICAgZWxlbWVudC5pbm5lckhUTUwgPSBKU09OLnN0cmluZ2lmeShyZXN1bHQpXG4gICAgZG9jdW1lbnQuYm9keS5hcHBlbmRDaGlsZChlbGVtZW50KTtcbn0gZWxzZSB7XG4gICAgLy8gc2VydmVyXG4gICAgY29uc29sZS5sb2coXCJub2RlOlwiLCByZXN1bHQpXG59XG5cbiJdLCJtYXBwaW5ncyI6IkFBQUEsWUFBWSxlQUFlO0FBSTNCLFFBQVEsSUFBSSxTQUFTO0FBQ3JCLElBQUksTUFBTSxVQUFVLEtBQUs7QUFDekIsTUFBTSxVQUFVLE9BQU8sS0FBSyxDQUFDLE1BQVcsRUFBRSxRQUFRLG1CQUFtQjtBQUNyRSxRQUFRLElBQUksR0FBRztBQUNmLE1BQU0sU0FBUyxLQUFLLFVBQVUsR0FBRztBQUVqQyxJQUFJLE9BQU8sYUFBYSxhQUFhO0FBRWpDLFFBQU0sVUFBVSxTQUFTLGNBQWMsS0FBSztBQUM1QyxVQUFRLFlBQVksS0FBSyxVQUFVLE1BQU07QUFDekMsV0FBUyxLQUFLLFlBQVksT0FBTztBQUNyQyxPQUFPO0FBRUgsVUFBUSxJQUFJLFNBQVMsTUFBTTtBQUMvQjsiLCJuYW1lcyI6W119


@@ -0,0 +1,20 @@
{
"name": "autovite",
"private": true,
"version": "0.0.0",
"type": "module",
"scripts": {
"dev": "vite",
"build": "tsc && vite build",
"preview": "vite preview"
},
"dependencies": {
"automerge": "1.0.1-preview.8"
},
"devDependencies": {
"typescript": "^4.6.4",
"vite": "^3.1.0",
"vite-plugin-top-level-await": "^1.1.1",
"vite-plugin-wasm": "^2.1.0"
}
}


@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" class="iconify iconify--logos" width="31.88" height="32" preserveAspectRatio="xMidYMid meet" viewBox="0 0 256 257"><defs><linearGradient id="IconifyId1813088fe1fbc01fb466" x1="-.828%" x2="57.636%" y1="7.652%" y2="78.411%"><stop offset="0%" stop-color="#41D1FF"></stop><stop offset="100%" stop-color="#BD34FE"></stop></linearGradient><linearGradient id="IconifyId1813088fe1fbc01fb467" x1="43.376%" x2="50.316%" y1="2.242%" y2="89.03%"><stop offset="0%" stop-color="#FFEA83"></stop><stop offset="8.333%" stop-color="#FFDD35"></stop><stop offset="100%" stop-color="#FFA800"></stop></linearGradient></defs><path fill="url(#IconifyId1813088fe1fbc01fb466)" d="M255.153 37.938L134.897 252.976c-2.483 4.44-8.862 4.466-11.382.048L.875 37.958c-2.746-4.814 1.371-10.646 6.827-9.67l120.385 21.517a6.537 6.537 0 0 0 2.322-.004l117.867-21.483c5.438-.991 9.574 4.796 6.877 9.62Z"></path><path fill="url(#IconifyId1813088fe1fbc01fb467)" d="M185.432.063L96.44 17.501a3.268 3.268 0 0 0-2.634 3.014l-5.474 92.456a3.268 3.268 0 0 0 3.997 3.378l24.777-5.718c2.318-.535 4.413 1.507 3.936 3.838l-7.361 36.047c-.495 2.426 1.782 4.5 4.151 3.78l15.304-4.649c2.372-.72 4.652 1.36 4.15 3.788l-11.698 56.621c-.732 3.542 3.979 5.473 5.943 2.437l1.313-2.028l72.516-144.72c1.215-2.423-.88-5.186-3.54-4.672l-25.505 4.922c-2.396.462-4.435-1.77-3.759-4.114l16.646-57.705c.677-2.35-1.37-4.583-3.769-4.113Z"></path></svg>

(After: SVG image, 1.5 KiB)


@@ -0,0 +1,9 @@
export function setupCounter(element: HTMLButtonElement) {
let counter = 0
const setCounter = (count: number) => {
counter = count
element.innerHTML = `count is ${counter}`
}
element.addEventListener('click', () => setCounter(++counter))
setCounter(0)
}


@@ -0,0 +1,18 @@
import * as Automerge from "automerge"
// hello world code that will run correctly on web or node
let doc = Automerge.init()
doc = Automerge.change(doc, (d: any) => d.hello = "from automerge-js")
const result = JSON.stringify(doc)
if (typeof document !== 'undefined') {
// browser
const element = document.createElement('div');
element.innerHTML = JSON.stringify(result)
document.body.appendChild(element);
} else {
// server
console.log("node:", result)
}


@@ -0,0 +1,97 @@
:root {
font-family: Inter, Avenir, Helvetica, Arial, sans-serif;
font-size: 16px;
line-height: 24px;
font-weight: 400;
color-scheme: light dark;
color: rgba(255, 255, 255, 0.87);
background-color: #242424;
font-synthesis: none;
text-rendering: optimizeLegibility;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
-webkit-text-size-adjust: 100%;
}
a {
font-weight: 500;
color: #646cff;
text-decoration: inherit;
}
a:hover {
color: #535bf2;
}
body {
margin: 0;
display: flex;
place-items: center;
min-width: 320px;
min-height: 100vh;
}
h1 {
font-size: 3.2em;
line-height: 1.1;
}
#app {
max-width: 1280px;
margin: 0 auto;
padding: 2rem;
text-align: center;
}
.logo {
height: 6em;
padding: 1.5em;
will-change: filter;
}
.logo:hover {
filter: drop-shadow(0 0 2em #646cffaa);
}
.logo.vanilla:hover {
filter: drop-shadow(0 0 2em #3178c6aa);
}
.card {
padding: 2em;
}
.read-the-docs {
color: #888;
}
button {
border-radius: 8px;
border: 1px solid transparent;
padding: 0.6em 1.2em;
font-size: 1em;
font-weight: 500;
font-family: inherit;
background-color: #1a1a1a;
cursor: pointer;
transition: border-color 0.25s;
}
button:hover {
border-color: #646cff;
}
button:focus,
button:focus-visible {
outline: 4px auto -webkit-focus-ring-color;
}
@media (prefers-color-scheme: light) {
:root {
color: #213547;
background-color: #ffffff;
}
a:hover {
color: #747bff;
}
button {
background-color: #f9f9f9;
}
}


@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" class="iconify iconify--logos" width="32" height="32" preserveAspectRatio="xMidYMid meet" viewBox="0 0 256 256"><path fill="#007ACC" d="M0 128v128h256V0H0z"></path><path fill="#FFF" d="m56.612 128.85l-.081 10.483h33.32v94.68h23.568v-94.68h33.321v-10.28c0-5.69-.122-10.444-.284-10.566c-.122-.162-20.4-.244-44.983-.203l-44.74.122l-.121 10.443Zm149.955-10.742c6.501 1.625 11.459 4.51 16.01 9.224c2.357 2.52 5.851 7.111 6.136 8.208c.08.325-11.053 7.802-17.798 11.988c-.244.162-1.22-.894-2.317-2.52c-3.291-4.795-6.745-6.867-12.028-7.233c-7.76-.528-12.759 3.535-12.718 10.321c0 1.992.284 3.17 1.097 4.795c1.707 3.536 4.876 5.649 14.832 9.956c18.326 7.883 26.168 13.084 31.045 20.48c5.445 8.249 6.664 21.415 2.966 31.208c-4.063 10.646-14.14 17.879-28.323 20.276c-4.388.772-14.79.65-19.504-.203c-10.28-1.828-20.033-6.908-26.047-13.572c-2.357-2.6-6.949-9.387-6.664-9.874c.122-.163 1.178-.813 2.356-1.504c1.138-.65 5.446-3.129 9.509-5.485l7.355-4.267l1.544 2.276c2.154 3.29 6.867 7.801 9.712 9.305c8.167 4.307 19.383 3.698 24.909-1.26c2.357-2.153 3.332-4.388 3.332-7.68c0-2.966-.366-4.266-1.91-6.501c-1.99-2.845-6.054-5.242-17.595-10.24c-13.206-5.69-18.895-9.224-24.096-14.832c-3.007-3.25-5.852-8.452-7.03-12.8c-.975-3.617-1.22-12.678-.447-16.335c2.723-12.76 12.353-21.659 26.25-24.3c4.51-.853 14.994-.528 19.424.569Z"></path></svg>

(After: SVG image, 1.4 KiB)


@@ -0,0 +1 @@
/// <reference types="vite/client" />


@@ -0,0 +1,20 @@
{
"compilerOptions": {
"target": "ESNext",
"useDefineForClassFields": true,
"module": "ESNext",
"lib": ["ESNext", "DOM"],
"moduleResolution": "Node",
"strict": true,
"sourceMap": true,
"resolveJsonModule": true,
"isolatedModules": true,
"esModuleInterop": true,
"noEmit": true,
"noUnusedLocals": true,
"noUnusedParameters": true,
"noImplicitReturns": true,
"skipLibCheck": true
},
"include": ["src"]
}


@@ -0,0 +1,15 @@
import { defineConfig } from "vite"
import wasm from "vite-plugin-wasm"
import topLevelAwait from "vite-plugin-top-level-await"
export default defineConfig({
plugins: [topLevelAwait(), wasm()],
optimizeDeps: {
// This is necessary because otherwise `vite dev` includes two separate
// versions of the JS wrapper. This causes problems because the JS
// wrapper has a module level variable to track JS side heap
// allocations, initializing this twice causes horrible breakage
exclude: ["automerge-wasm"]
}
})


@@ -10,13 +10,13 @@
   },
   "author": "",
   "dependencies": {
-    "automerge-js": "file:automerge-js-0.1.0.tgz",
-    "automerge-wasm": "file:automerge-wasm-0.1.3.tgz"
+    "automerge": "1.0.1-preview.8"
   },
   "devDependencies": {
     "serve": "^13.0.2",
     "webpack": "^5.72.1",
     "webpack-cli": "^4.9.2",
+    "webpack-dev-server": "^4.11.1",
     "webpack-node-externals": "^3.0.0"
   }
 }


@@ -1,22 +1,18 @@
-import * as Automerge from "automerge-js"
-import init from "automerge-wasm"
+import * as Automerge from "automerge"

 // hello world code that will run correctly on web or node

-init().then((api) => {
-  Automerge.use(api)
-  let doc = Automerge.init()
-  doc = Automerge.change(doc, (d) => d.hello = "from automerge-js")
-  const result = JSON.stringify(doc)
+let doc = Automerge.init()
+doc = Automerge.change(doc, (d) => d.hello = "from automerge-js")
+const result = JSON.stringify(doc)

 if (typeof document !== 'undefined') {
   // browser
   const element = document.createElement('div');
   element.innerHTML = JSON.stringify(result)
   document.body.appendChild(element);
 } else {
   // server
   console.log("node:", result)
 }
-})


@@ -18,6 +18,7 @@ const serverConfig = {
 };

 const clientConfig = {
+  experiments: { asyncWebAssembly: true },
   target: 'web',
   entry: './src/index.js',
   output: {


@@ -1,10 +1,10 @@
 {
-  "name": "automerge-js",
+  "name": "automerge",
   "collaborators": [
     "Orion Henry <orion@inkandswitch.com>",
     "Martin Kleppmann"
   ],
-  "version": "0.1.12",
+  "version": "1.0.1-preview.8",
   "description": "Reimplementation of `automerge` on top of the automerge-wasm backend",
   "homepage": "https://github.com/automerge/automerge-rs/tree/main/automerge-js",
   "repository": "github:automerge/automerge-rs",
@@ -47,7 +47,6 @@
     "@types/uuid": "^8.3.4",
     "@typescript-eslint/eslint-plugin": "^5.25.0",
     "@typescript-eslint/parser": "^5.25.0",
-    "automerge-wasm": "^0.1.6",
     "eslint": "^8.15.0",
     "fast-sha256": "^1.3.0",
     "mocha": "^10.0.0",
@@ -56,7 +55,8 @@
     "typescript": "^4.6.4"
   },
   "dependencies": {
-    "automerge-types": "0.1.5",
+    "automerge-types": "0.1.6",
+    "automerge-wasm": "0.1.7",
     "uuid": "^8.3"
   }
 }


@@ -24,10 +24,14 @@ export interface State<T> {
   snapshot: T
 }

 export function use(api: API) {
   UseApi(api)
 }

+import * as wasm from "automerge-wasm"
+use(wasm)
+
 export function getBackend<T>(doc: Doc<T>) : Automerge {
   return _state(doc)
 }
@@ -87,7 +91,7 @@ export function free<T>(doc: Doc<T>) {
   return _state(doc).free()
 }

-export function from<T>(initialState: T | Doc<T>, actor?: ActorId): Doc<T> {
+export function from<T extends {}>(initialState: T | Doc<T>, actor?: ActorId): Doc<T> {
   return change(init(actor), (d) => Object.assign(d, initialState))
 }


@@ -11,15 +11,15 @@ export function UseApi(api: API) {
 /* eslint-disable */
 export const ApiHandler : API = {
   create(actor?: Actor): Automerge { throw new RangeError("Automerge.use() not called") },
-  load(data: Uint8Array, actor?: Actor): Automerge { throw new RangeError("Automerge.use() not called") },
+  load(data: Uint8Array, actor?: Actor): Automerge { throw new RangeError("Automerge.use() not called (load)") },
-  encodeChange(change: DecodedChange): Change { throw new RangeError("Automerge.use() not called") },
+  encodeChange(change: DecodedChange): Change { throw new RangeError("Automerge.use() not called (encodeChange)") },
-  decodeChange(change: Change): DecodedChange { throw new RangeError("Automerge.use() not called") },
+  decodeChange(change: Change): DecodedChange { throw new RangeError("Automerge.use() not called (decodeChange)") },
-  initSyncState(): SyncState { throw new RangeError("Automerge.use() not called") },
+  initSyncState(): SyncState { throw new RangeError("Automerge.use() not called (initSyncState)") },
-  encodeSyncMessage(message: DecodedSyncMessage): SyncMessage { throw new RangeError("Automerge.use() not called") },
+  encodeSyncMessage(message: DecodedSyncMessage): SyncMessage { throw new RangeError("Automerge.use() not called (encodeSyncMessage)") },
-  decodeSyncMessage(msg: SyncMessage): DecodedSyncMessage { throw new RangeError("Automerge.use() not called") },
+  decodeSyncMessage(msg: SyncMessage): DecodedSyncMessage { throw new RangeError("Automerge.use() not called (decodeSyncMessage)") },
-  encodeSyncState(state: SyncState): Uint8Array { throw new RangeError("Automerge.use() not called") },
+  encodeSyncState(state: SyncState): Uint8Array { throw new RangeError("Automerge.use() not called (encodeSyncState)") },
-  decodeSyncState(data: Uint8Array): SyncState { throw new RangeError("Automerge.use() not called") },
+  decodeSyncState(data: Uint8Array): SyncState { throw new RangeError("Automerge.use() not called (decodeSyncState)") },
-  exportSyncState(state: SyncState): JsSyncState { throw new RangeError("Automerge.use() not called") },
+  exportSyncState(state: SyncState): JsSyncState { throw new RangeError("Automerge.use() not called (exportSyncState)") },
-  importSyncState(state: JsSyncState): SyncState { throw new RangeError("Automerge.use() not called") },
+  importSyncState(state: JsSyncState): SyncState { throw new RangeError("Automerge.use() not called (importSyncState)") },
 }
 /* eslint-enable */
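`ApiHandler` is a stub object: every method throws a `RangeError` naming itself until `use()` copies a real backend over the stubs. A reduced sketch of that pattern, with the `Api` shape cut down to a single illustrative method:

```typescript
// Stub-until-injected: the const handler starts with throwing stubs, and
// use() overwrites them in place with the real implementation.
interface Api {
  create(actor?: string): string
}

const ApiHandler: Api = {
  create(actor?: string): string { throw new RangeError("Automerge.use() not called (create)") },
}

function use(api: Api) {
  Object.assign(ApiHandler, api)
}

// Inject a toy backend and call through the handler.
use({ create: (actor?: string) => `doc-for-${actor ?? "anon"}` })
const created = ApiHandler.create("alice")
```

Because callers hold a reference to the `const` object rather than to its methods, the in-place `Object.assign` rewires every call site at once.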

View file

@@ -2,9 +2,6 @@ import * as tt from "automerge-types"
 import * as assert from 'assert'
 import * as util from 'util'
 import * as Automerge from '../src'
-import * as AutomergeWASM from "automerge-wasm"
-Automerge.use(AutomergeWASM)
 describe('Automerge', () => {
   describe('basics', () => {
@@ -175,4 +172,15 @@ describe('Automerge', () => {
     console.log(doc.text.indexOf("world"))
   })
 })
+it('should obtain the same conflicts, regardless of merge order', () => {
+  let s1 = Automerge.init()
+  let s2 = Automerge.init()
+  s1 = Automerge.change(s1, doc => { doc.x = 1; doc.y = 2 })
+  s2 = Automerge.change(s2, doc => { doc.x = 3; doc.y = 4 })
+  const m1 = Automerge.merge(Automerge.clone(s1), Automerge.clone(s2))
+  const m2 = Automerge.merge(Automerge.clone(s2), Automerge.clone(s1))
+  assert.deepStrictEqual(Automerge.getConflicts(m1, 'x'), Automerge.getConflicts(m2, 'x'))
+})
 })
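The new test asserts that conflicts come out identical whichever way the documents are merged. That holds because the winner of a conflict is chosen by a deterministic total order on the competing operations, not by arrival order. A toy sketch of the idea (the `Op` shape and the actor-id tiebreak are illustrative simplifications, not Automerge's actual internal rule):

```typescript
// If conflict resolution is a pure function of the competing ops, it is
// symmetric in merge order: resolve(a, b) and resolve(b, a) pick the same op.
type Op = { value: number; actor: string }

function resolve(a: Op, b: Op): Op {
  // deterministic tiebreak: higher actor id wins, regardless of argument order
  return a.actor > b.actor ? a : b
}

const fromS1: Op = { value: 1, actor: "aaaa" }
const fromS2: Op = { value: 3, actor: "bbbb" }
const m1 = resolve(fromS1, fromS2) // merge s2 into s1
const m2 = resolve(fromS2, fromS1) // merge s1 into s2
```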

View file

@@ -2,9 +2,6 @@ import * as assert from 'assert'
 import { checkEncoded } from './helpers'
 import * as Automerge from '../src'
 import { encodeChange, decodeChange } from '../src'
-import * as AutomergeWASM from "automerge-wasm"
-Automerge.use(AutomergeWASM)
 describe('change encoding', () => {
   it('should encode text edits', () => {

View file

@@ -2,9 +2,6 @@ import * as assert from 'assert'
 import * as Automerge from '../src'
 import { assertEqualsOneOf } from './helpers'
 import { decodeChange } from './legacy/columnar'
-import * as AutomergeWASM from "automerge-wasm"
-Automerge.use(AutomergeWASM)
 const UUID_PATTERN = /^[0-9a-f]{32}$/
 const OPID_PATTERN = /^[0-9]+@[0-9a-f]{32}$/

View file

@@ -3,9 +3,6 @@ import * as Automerge from '../src'
 import { BloomFilter } from './legacy/sync'
 import { decodeChangeMeta } from './legacy/columnar'
 import { decodeSyncMessage, encodeSyncMessage, decodeSyncState, encodeSyncState, initSyncState } from "../src"
-import * as AutomergeWASM from "automerge-wasm"
-Automerge.use(AutomergeWASM)
 function inspect(a) {
   const util = require("util");

View file

@@ -1,9 +1,6 @@
 import * as assert from 'assert'
 import * as Automerge from '../src'
 import { assertEqualsOneOf } from './helpers'
-import * as AutomergeWASM from "automerge-wasm"
-Automerge.use(AutomergeWASM)
 function attributeStateToAttributes(accumulatedAttributes) {
   const attributes = {}

View file

@@ -1,8 +1,5 @@
 import * as assert from 'assert'
 import * as Automerge from '../src'
-import * as AutomergeWASM from "automerge-wasm"
-Automerge.use(AutomergeWASM)
 const uuid = Automerge.uuid

View file

@@ -35,7 +35,7 @@ hex = "^0.4.3"
 regex = "^1.5"
 [dependencies.wasm-bindgen]
-version = "^0.2"
+version = "^0.2.83"
 #features = ["std"]
 features = ["serde-serialize", "std"]

View file

@@ -1,2 +1,17 @@
+import { Automerge as VanillaAutomerge } from "automerge-types"
 export * from "automerge-types"
 export { default } from "automerge-types"
+export class Automerge extends VanillaAutomerge {
+  // experimental api can go here
+  applyPatches(obj: any): any;
+  // override old methods that return automerge
+  clone(actor?: string): Automerge;
+  fork(actor?: string): Automerge;
+  forkAt(heads: Heads, actor?: string): Automerge;
+}
+export function create(actor?: Actor): Automerge;
+export function load(data: Uint8Array, actor?: Actor): Automerge;

View file

@@ -1,5 +0,0 @@
-let wasm = require("./bindgen")
-module.exports = wasm
-module.exports.load = module.exports.loadDoc
-delete module.exports.loadDoc
-module.exports.init = () => (new Promise((resolve,reject) => { resolve(module.exports) }))

View file

@@ -8,29 +8,29 @@
 "description": "wasm-bindgen bindings to the automerge rust implementation",
 "homepage": "https://github.com/automerge/automerge-rs/tree/main/automerge-wasm",
 "repository": "github:automerge/automerge-rs",
-"version": "0.1.6",
+"version": "0.1.7",
 "license": "MIT",
 "files": [
   "README.md",
   "LICENSE",
   "package.json",
   "index.d.ts",
-  "nodejs/index.js",
   "nodejs/bindgen.js",
   "nodejs/bindgen_bg.wasm",
-  "web/index.js",
-  "web/bindgen.js",
-  "web/bindgen_bg.wasm"
+  "bundler/bindgen.js",
+  "bundler/bindgen_bg.js",
+  "bundler/bindgen_bg.wasm"
 ],
 "types": "index.d.ts",
-"module": "./web/index.js",
+"module": "./bundler/bindgen.js",
-"main": "./nodejs/index.js",
+"main": "./nodejs/bindgen.js",
 "scripts": {
   "lint": "eslint test/*.ts",
   "build": "cross-env PROFILE=dev TARGET=nodejs FEATURES='' yarn target",
-  "debug": "cross-env PROFILE=dev yarn buildall",
   "release": "cross-env PROFILE=release yarn buildall",
-  "buildall": "cross-env TARGET=nodejs yarn target && cross-env TARGET=web yarn target",
+  "buildall": "cross-env TARGET=nodejs yarn target && cross-env TARGET=bundler yarn target",
-  "target": "rimraf ./$TARGET && wasm-pack build --target $TARGET --$PROFILE --out-name bindgen -d $TARGET -- $FEATURES && cp $TARGET-index.js $TARGET/index.js",
+  "target": "rimraf ./$TARGET && wasm-pack build --target $TARGET --$PROFILE --out-name bindgen -d $TARGET -- $FEATURES",
   "test": "ts-mocha -p tsconfig.json --type-check --bail --full-trace test/*.ts"
 },
 "devDependencies": {
@@ -51,6 +51,10 @@
   "typescript": "^4.6.4"
 },
 "dependencies": {
-  "automerge-types": "0.1.5"
+  "automerge-types": "0.1.6"
+},
+"exports": {
+  "browser": "./bundler/bindgen.js",
+  "require": "./nodejs/bindgen.js"
 }
 }
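The new `exports` field lets one package expose different entry points to Node and to bundlers. A simplified model of how a resolver consumes such a map, taking the first condition that matches (this mirrors, but greatly reduces, Node's conditional-exports algorithm):

```typescript
// Toy resolver for a flat conditional-exports map. Real resolvers also
// handle nested conditions, "default", and subpath exports.
const exportsMap: Record<string, string> = {
  browser: "./bundler/bindgen.js",
  require: "./nodejs/bindgen.js",
}

function resolveEntry(conditions: string[]): string {
  for (const c of conditions) {
    if (exportsMap[c]) return exportsMap[c]
  }
  throw new Error("no matching export condition")
}

// Node's CommonJS loader asserts conditions like ["node", "require"];
// a bundler asserts something like ["browser", "import"].
const nodeEntry = resolveEntry(["node", "require"])
const bundlerEntry = resolveEntry(["browser", "import"])
```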

View file

@@ -1,13 +1,14 @@
+use crate::AutoCommit;
 use automerge as am;
 use automerge::transaction::Transactable;
 use automerge::{Change, ChangeHash, Prop};
-use js_sys::{Array, Object, Reflect, Uint8Array};
+use js_sys::{Array, Function, Object, Reflect, Uint8Array};
 use std::collections::{BTreeSet, HashSet};
 use std::fmt::Display;
 use wasm_bindgen::prelude::*;
 use wasm_bindgen::JsCast;
-use crate::{ObjId, ScalarValue, Value};
+use crate::{observer::Patch, ObjId, ScalarValue, Value};
 pub(crate) struct JS(pub(crate) JsValue);
 pub(crate) struct AR(pub(crate) Array);
@@ -354,7 +355,7 @@ pub(crate) fn get_heads(heads: Option<Array>) -> Option<Vec<ChangeHash>> {
     heads.ok()
 }
-pub(crate) fn map_to_js(doc: &am::AutoCommit, obj: &ObjId) -> JsValue {
+pub(crate) fn map_to_js(doc: &AutoCommit, obj: &ObjId) -> JsValue {
     let keys = doc.keys(obj);
     let map = Object::new();
     for k in keys {
@@ -380,7 +381,7 @@ pub(crate) fn map_to_js(doc: &am::AutoCommit, obj: &ObjId) -> JsValue {
     map.into()
 }
-pub(crate) fn map_to_js_at(doc: &am::AutoCommit, obj: &ObjId, heads: &[ChangeHash]) -> JsValue {
+pub(crate) fn map_to_js_at(doc: &AutoCommit, obj: &ObjId, heads: &[ChangeHash]) -> JsValue {
     let keys = doc.keys(obj);
     let map = Object::new();
     for k in keys {
@@ -406,7 +407,7 @@ pub(crate) fn map_to_js_at(doc: &am::AutoCommit, obj: &ObjId, heads: &[ChangeHas
     map.into()
 }
-pub(crate) fn list_to_js(doc: &am::AutoCommit, obj: &ObjId) -> JsValue {
+pub(crate) fn list_to_js(doc: &AutoCommit, obj: &ObjId) -> JsValue {
     let len = doc.length(obj);
     let array = Array::new();
     for i in 0..len {
@@ -432,7 +433,7 @@ pub(crate) fn list_to_js(doc: &am::AutoCommit, obj: &ObjId) -> JsValue {
     array.into()
 }
-pub(crate) fn list_to_js_at(doc: &am::AutoCommit, obj: &ObjId, heads: &[ChangeHash]) -> JsValue {
+pub(crate) fn list_to_js_at(doc: &AutoCommit, obj: &ObjId, heads: &[ChangeHash]) -> JsValue {
     let len = doc.length(obj);
     let array = Array::new();
     for i in 0..len {
@@ -457,3 +458,143 @@ pub(crate) fn list_to_js_at(doc: &am::AutoCommit, obj: &ObjId, heads: &[ChangeHa
     }
     array.into()
 }
/*
pub(crate) fn export_values<'a, V: Iterator<Item = Value<'a>>>(val: V) -> Array {
val.map(|v| export_value(&v)).collect()
}
*/
pub(crate) fn export_value(val: &Value<'_>) -> JsValue {
match val {
Value::Object(o) if o == &am::ObjType::Map || o == &am::ObjType::Table => {
Object::new().into()
}
Value::Object(_) => Array::new().into(),
Value::Scalar(v) => ScalarValue(v.clone()).into(),
}
}
pub(crate) fn apply_patch(obj: JsValue, patch: &Patch) -> Result<JsValue, JsValue> {
apply_patch2(obj, patch, 0)
}
pub(crate) fn apply_patch2(obj: JsValue, patch: &Patch, depth: usize) -> Result<JsValue, JsValue> {
match (js_to_map_seq(&obj)?, patch.path().get(depth)) {
(JsObj::Map(o), Some(Prop::Map(key))) => {
let sub_obj = Reflect::get(&obj, &key.into())?;
let new_value = apply_patch2(sub_obj, patch, depth + 1)?;
let result =
Reflect::construct(&o.constructor(), &Array::new())?.dyn_into::<Object>()?;
let result = Object::assign(&result, &o).into();
Reflect::set(&result, &key.into(), &new_value)?;
Ok(result)
}
(JsObj::Seq(a), Some(Prop::Seq(index))) => {
let index = JsValue::from_f64(*index as f64);
let sub_obj = Reflect::get(&obj, &index)?;
let new_value = apply_patch2(sub_obj, patch, depth + 1)?;
let result = Reflect::construct(&a.constructor(), &a)?;
//web_sys::console::log_2(&format!("NEW VAL {}: ", tmpi).into(), &new_value);
Reflect::set(&result, &index, &new_value)?;
Ok(result)
}
(JsObj::Map(o), None) => {
let result =
Reflect::construct(&o.constructor(), &Array::new())?.dyn_into::<Object>()?;
let result = Object::assign(&result, &o);
match patch {
Patch::PutMap { key, value, .. } => {
let result = result.into();
Reflect::set(&result, &key.into(), &export_value(value))?;
Ok(result)
}
Patch::DeleteMap { key, .. } => {
Reflect::delete_property(&result, &key.into())?;
Ok(result.into())
}
Patch::Increment { prop, value, .. } => {
let result = result.into();
if let Prop::Map(key) = prop {
let key = key.into();
let old_val = Reflect::get(&o, &key)?;
if let Some(old) = old_val.as_f64() {
Reflect::set(&result, &key, &JsValue::from(old + *value as f64))?;
Ok(result)
} else {
Err(to_js_err("cant increment a non number value"))
}
} else {
Err(to_js_err("cant increment an index on a map"))
}
}
Patch::Insert { .. } => Err(to_js_err("cannot insert into map")),
Patch::DeleteSeq { .. } => Err(to_js_err("cannot splice a map")),
Patch::PutSeq { .. } => Err(to_js_err("cannot array index a map")),
}
}
(JsObj::Seq(a), None) => {
match patch {
Patch::PutSeq { index, value, .. } => {
let result = Reflect::construct(&a.constructor(), &a)?;
Reflect::set(&result, &(*index as f64).into(), &export_value(value))?;
Ok(result)
}
Patch::DeleteSeq { index, .. } => {
let result = &a.dyn_into::<Array>()?;
let mut f = |_, i, _| i != *index as u32;
let result = result.filter(&mut f);
Ok(result.into())
}
Patch::Insert { index, values, .. } => {
let from = Reflect::get(&a.constructor().into(), &"from".into())?
.dyn_into::<Function>()?;
let result = from.call1(&JsValue::undefined(), &a)?.dyn_into::<Array>()?;
// TODO: should be one function call
for (i, v) in values.iter().enumerate() {
result.splice(*index as u32 + i as u32, 0, &export_value(v));
}
Ok(result.into())
}
Patch::Increment { prop, value, .. } => {
let result = Reflect::construct(&a.constructor(), &a)?;
if let Prop::Seq(index) = prop {
let index = (*index as f64).into();
let old_val = Reflect::get(&a, &index)?;
if let Some(old) = old_val.as_f64() {
Reflect::set(&result, &index, &JsValue::from(old + *value as f64))?;
Ok(result)
} else {
Err(to_js_err("cant increment a non number value"))
}
} else {
Err(to_js_err("cant increment a key on a seq"))
}
}
Patch::DeleteMap { .. } => Err(to_js_err("cannot delete from a seq")),
Patch::PutMap { .. } => Err(to_js_err("cannot set key in seq")),
}
}
(_, _) => Err(to_js_err(format!(
    "object/patch mismatch {:?} depth={:?}",
    patch, depth
))),
}
}
#[derive(Debug)]
enum JsObj {
Map(Object),
Seq(Array),
}
fn js_to_map_seq(value: &JsValue) -> Result<JsObj, JsValue> {
if let Ok(array) = value.clone().dyn_into::<Array>() {
Ok(JsObj::Seq(array))
} else if let Ok(obj) = value.clone().dyn_into::<Object>() {
Ok(JsObj::Map(obj))
} else {
Err(to_js_err("obj is not Object or Array"))
}
}
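`apply_patch2` rebuilds only the containers along the patch path: each map or array on the way down is shallow-copied (via `Reflect::construct` plus `Object.assign`, or `Array.from`), while every untouched sibling remains structurally shared with the input. The same recursion in TypeScript, with a deliberately simplified patch shape (a path of keys plus a final put; the real patches also carry deletes, inserts, and increments):

```typescript
// Copy-on-write patch application: only the spine from the root to the
// patched property is copied; everything off the path is shared by reference.
type PutPatch = { path: (string | number)[]; key: string | number; value: unknown }

function applyPatch(obj: any, patch: PutPatch, depth = 0): any {
  const copy = Array.isArray(obj) ? obj.slice() : { ...obj }
  if (depth < patch.path.length) {
    // still descending: copy this container and recurse into one child
    const step = patch.path[depth]
    copy[step] = applyPatch(obj[step], patch, depth + 1)
  } else {
    // reached the target container: apply the put
    copy[patch.key] = patch.value
  }
  return copy
}

const before = { outer: { a: 1 }, untouched: { b: 2 } }
const after = applyPatch(before, { path: ["outer"], key: "a", value: 9 })
```

Note that `before` is never mutated, so callers can keep old snapshots, and `after.untouched` is the very same object as `before.untouched`.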

View file

@@ -28,10 +28,7 @@
 #![allow(clippy::unused_unit)]
 use am::transaction::CommitOptions;
 use am::transaction::Transactable;
-use am::ApplyOptions;
 use automerge as am;
-use automerge::Patch;
-use automerge::VecOpObserver;
 use automerge::{Change, ObjId, Prop, Value, ROOT};
 use js_sys::{Array, Object, Uint8Array};
 use std::convert::TryInto;
@@ -39,12 +36,15 @@ use wasm_bindgen::prelude::*;
 use wasm_bindgen::JsCast;
 mod interop;
+mod observer;
 mod sync;
 mod value;
+use observer::Observer;
 use interop::{
-    get_heads, js_get, js_set, list_to_js, list_to_js_at, map_to_js, map_to_js_at, to_js_err,
-    to_objtype, to_prop, AR, JS,
+    apply_patch, get_heads, js_get, js_set, list_to_js, list_to_js_at, map_to_js, map_to_js_at,
+    to_js_err, to_objtype, to_prop, AR, JS,
 };
 use sync::SyncState;
 use value::{datatype, ScalarValue};
@@ -56,6 +56,8 @@ macro_rules! log {
     };
 }
+type AutoCommit = am::AutoCommitWithObs<Observer>;
 #[cfg(feature = "wee_alloc")]
 #[global_allocator]
 static ALLOC: wee_alloc::WeeAlloc = wee_alloc::WeeAlloc::INIT;
@@ -63,40 +65,24 @@ static ALLOC: wee_alloc::WeeAlloc = wee_alloc::WeeAlloc::INIT;
 #[wasm_bindgen]
 #[derive(Debug)]
 pub struct Automerge {
-    doc: automerge::AutoCommit,
+    doc: AutoCommit,
-    observer: Option<VecOpObserver>,
 }
 #[wasm_bindgen]
 impl Automerge {
     pub fn new(actor: Option<String>) -> Result<Automerge, JsValue> {
-        let mut automerge = automerge::AutoCommit::new();
+        let mut doc = AutoCommit::default();
         if let Some(a) = actor {
             let a = automerge::ActorId::from(hex::decode(a).map_err(to_js_err)?.to_vec());
-            automerge.set_actor(a);
-        }
-        Ok(Automerge {
-            doc: automerge,
-            observer: None,
-        })
-    }
-    fn ensure_transaction_closed(&mut self) {
-        if self.doc.pending_ops() > 0 {
-            let mut opts = CommitOptions::default();
-            if let Some(observer) = self.observer.as_mut() {
-                opts.set_op_observer(observer);
-            }
-            self.doc.commit_with(opts);
-        }
+            doc.set_actor(a);
+        }
+        Ok(Automerge { doc })
     }
     #[allow(clippy::should_implement_trait)]
     pub fn clone(&mut self, actor: Option<String>) -> Result<Automerge, JsValue> {
-        self.ensure_transaction_closed();
         let mut automerge = Automerge {
             doc: self.doc.clone(),
-            observer: None,
         };
         if let Some(s) = actor {
             let actor = automerge::ActorId::from(hex::decode(s).map_err(to_js_err)?.to_vec());
@@ -106,10 +92,8 @@ impl Automerge {
     }
     pub fn fork(&mut self, actor: Option<String>) -> Result<Automerge, JsValue> {
-        self.ensure_transaction_closed();
         let mut automerge = Automerge {
             doc: self.doc.fork(),
-            observer: None,
         };
         if let Some(s) = actor {
             let actor = automerge::ActorId::from(hex::decode(s).map_err(to_js_err)?.to_vec());
@@ -123,7 +107,6 @@ impl Automerge {
         let deps: Vec<_> = JS(heads).try_into()?;
         let mut automerge = Automerge {
             doc: self.doc.fork_at(&deps)?,
-            observer: None,
         };
         if let Some(s) = actor {
             let actor = automerge::ActorId::from(hex::decode(s).map_err(to_js_err)?.to_vec());
@@ -147,21 +130,12 @@ impl Automerge {
         if let Some(time) = time {
             commit_opts.set_time(time as i64);
         }
-        if let Some(observer) = self.observer.as_mut() {
-            commit_opts.set_op_observer(observer);
-        }
         let hash = self.doc.commit_with(commit_opts);
         JsValue::from_str(&hex::encode(&hash.0))
     }
     pub fn merge(&mut self, other: &mut Automerge) -> Result<Array, JsValue> {
-        self.ensure_transaction_closed();
-        let options = if let Some(observer) = self.observer.as_mut() {
-            ApplyOptions::default().with_op_observer(observer)
-        } else {
-            ApplyOptions::default()
-        };
-        let heads = self.doc.merge_with(&mut other.doc, options)?;
+        let heads = self.doc.merge(&mut other.doc)?;
         let heads: Array = heads
             .iter()
             .map(|h| JsValue::from_str(&hex::encode(&h.0)))
@@ -453,84 +427,30 @@ impl Automerge {
     pub fn enable_patches(&mut self, enable: JsValue) -> Result<(), JsValue> {
         let enable = enable
             .as_bool()
-            .ok_or_else(|| to_js_err("expected boolean"))?;
+            .ok_or_else(|| to_js_err("must pass a bool to enable_patches"))?;
-        if enable {
-            if self.observer.is_none() {
-                self.observer = Some(VecOpObserver::default());
-            }
-        } else {
-            self.observer = None;
-        }
+        self.doc.observer().enable(enable);
         Ok(())
     }
+    #[wasm_bindgen(js_name = applyPatches)]
+    pub fn apply_patches(&mut self, mut object: JsValue) -> Result<JsValue, JsValue> {
+        let patches = self.doc.observer().take_patches();
+        for p in patches {
+            object = apply_patch(object, &p)?;
+        }
+        Ok(object)
+    }
     #[wasm_bindgen(js_name = popPatches)]
     pub fn pop_patches(&mut self) -> Result<Array, JsValue> {
         // transactions send out observer updates as they occur, not waiting for them to be
         // committed.
         // If we pop the patches then we won't be able to revert them.
-        self.ensure_transaction_closed();
-        let patches = self
-            .observer
-            .as_mut()
-            .map_or_else(Vec::new, |o| o.take_patches());
+        let patches = self.doc.observer().take_patches();
         let result = Array::new();
         for p in patches {
-            let patch = Object::new();
-            match p {
-                Patch::Put {
-                    obj,
-                    key,
-                    value,
-                    conflict,
-                } => {
-                    js_set(&patch, "action", "put")?;
-                    js_set(&patch, "obj", obj.to_string())?;
-                    js_set(&patch, "key", key)?;
-                    match value {
-                        (Value::Object(obj_type), obj_id) => {
-                            js_set(&patch, "datatype", obj_type.to_string())?;
-                            js_set(&patch, "value", obj_id.to_string())?;
-                        }
-                        (Value::Scalar(value), _) => {
-                            js_set(&patch, "datatype", datatype(&value))?;
-                            js_set(&patch, "value", ScalarValue(value))?;
-                        }
-                    };
-                    js_set(&patch, "conflict", conflict)?;
-                }
-                Patch::Insert { obj, index, value } => {
-                    js_set(&patch, "action", "insert")?;
-                    js_set(&patch, "obj", obj.to_string())?;
-                    js_set(&patch, "key", index as f64)?;
-                    match value {
-                        (Value::Object(obj_type), obj_id) => {
-                            js_set(&patch, "datatype", obj_type.to_string())?;
-                            js_set(&patch, "value", obj_id.to_string())?;
-                        }
-                        (Value::Scalar(value), _) => {
-                            js_set(&patch, "datatype", datatype(&value))?;
-                            js_set(&patch, "value", ScalarValue(value))?;
-                        }
-                    };
-                }
-                Patch::Increment { obj, key, value } => {
-                    js_set(&patch, "action", "increment")?;
-                    js_set(&patch, "obj", obj.to_string())?;
-                    js_set(&patch, "key", key)?;
-                    js_set(&patch, "value", value.0)?;
-                }
-                Patch::Delete { obj, key } => {
-                    js_set(&patch, "action", "delete")?;
-                    js_set(&patch, "obj", obj.to_string())?;
-                    js_set(&patch, "key", key)?;
-                }
-            }
-            result.push(&patch);
+            result.push(&p.try_into()?);
         }
         Ok(result)
     }
@@ -552,51 +472,31 @@ impl Automerge {
     }
     pub fn save(&mut self) -> Uint8Array {
-        self.ensure_transaction_closed();
         Uint8Array::from(self.doc.save().as_slice())
     }
     #[wasm_bindgen(js_name = saveIncremental)]
     pub fn save_incremental(&mut self) -> Uint8Array {
-        self.ensure_transaction_closed();
         let bytes = self.doc.save_incremental();
         Uint8Array::from(bytes.as_slice())
     }
     #[wasm_bindgen(js_name = loadIncremental)]
     pub fn load_incremental(&mut self, data: Uint8Array) -> Result<f64, JsValue> {
-        self.ensure_transaction_closed();
         let data = data.to_vec();
-        let options = if let Some(observer) = self.observer.as_mut() {
-            ApplyOptions::default().with_op_observer(observer)
-        } else {
-            ApplyOptions::default()
-        };
-        let len = self
-            .doc
-            .load_incremental_with(&data, options)
-            .map_err(to_js_err)?;
+        let len = self.doc.load_incremental(&data).map_err(to_js_err)?;
         Ok(len as f64)
     }
     #[wasm_bindgen(js_name = applyChanges)]
     pub fn apply_changes(&mut self, changes: JsValue) -> Result<(), JsValue> {
-        self.ensure_transaction_closed();
         let changes: Vec<_> = JS(changes).try_into()?;
-        let options = if let Some(observer) = self.observer.as_mut() {
-            ApplyOptions::default().with_op_observer(observer)
-        } else {
-            ApplyOptions::default()
-        };
-        self.doc
-            .apply_changes_with(changes, options)
-            .map_err(to_js_err)?;
+        self.doc.apply_changes(changes).map_err(to_js_err)?;
         Ok(())
     }
     #[wasm_bindgen(js_name = getChanges)]
     pub fn get_changes(&mut self, have_deps: JsValue) -> Result<Array, JsValue> {
-        self.ensure_transaction_closed();
         let deps: Vec<_> = JS(have_deps).try_into()?;
         let changes = self.doc.get_changes(&deps)?;
         let changes: Array = changes
@@ -608,7 +508,6 @@ impl Automerge {
     #[wasm_bindgen(js_name = getChangeByHash)]
     pub fn get_change_by_hash(&mut self, hash: JsValue) -> Result<JsValue, JsValue> {
-        self.ensure_transaction_closed();
         let hash = hash.into_serde().map_err(to_js_err)?;
         let change = self.doc.get_change_by_hash(&hash);
         if let Some(c) = change {
@@ -620,7 +519,6 @@ impl Automerge {
     #[wasm_bindgen(js_name = getChangesAdded)]
     pub fn get_changes_added(&mut self, other: &mut Automerge) -> Result<Array, JsValue> {
-        self.ensure_transaction_closed();
         let changes = self.doc.get_changes_added(&mut other.doc);
         let changes: Array = changes
             .iter()
@@ -631,7 +529,6 @@ impl Automerge {
     #[wasm_bindgen(js_name = getHeads)]
     pub fn get_heads(&mut self) -> Array {
-        self.ensure_transaction_closed();
         let heads = self.doc.get_heads();
         let heads: Array = heads
             .iter()
@@ -648,7 +545,6 @@ impl Automerge {
     #[wasm_bindgen(js_name = getLastLocalChange)]
     pub fn get_last_local_change(&mut self) -> Result<JsValue, JsValue> {
-        self.ensure_transaction_closed();
         if let Some(change) = self.doc.get_last_local_change() {
             Ok(Uint8Array::from(change.raw_bytes()).into())
         } else {
@@ -657,13 +553,11 @@ impl Automerge {
     }
     pub fn dump(&mut self) {
-        self.ensure_transaction_closed();
         self.doc.dump()
     }
     #[wasm_bindgen(js_name = getMissingDeps)]
     pub fn get_missing_deps(&mut self, heads: Option<Array>) -> Result<Array, JsValue> {
-        self.ensure_transaction_closed();
         let heads = get_heads(heads).unwrap_or_default();
         let deps = self.doc.get_missing_deps(&heads);
         let deps: Array = deps
@@ -679,23 +573,16 @@ impl Automerge {
         state: &mut SyncState,
         message: Uint8Array,
     ) -> Result<(), JsValue> {
-        self.ensure_transaction_closed();
         let message = message.to_vec();
         let message = am::sync::Message::decode(message.as_slice()).map_err(to_js_err)?;
-        let options = if let Some(observer) = self.observer.as_mut() {
-            ApplyOptions::default().with_op_observer(observer)
-        } else {
-            ApplyOptions::default()
-        };
         self.doc
-            .receive_sync_message_with(&mut state.0, message, options)
+            .receive_sync_message(&mut state.0, message)
            .map_err(to_js_err)?;
         Ok(())
     }
     #[wasm_bindgen(js_name = generateSyncMessage)]
     pub fn generate_sync_message(&mut self, state: &mut SyncState) -> Result<JsValue, JsValue> {
-        self.ensure_transaction_closed();
         if let Some(message) = self.doc.generate_sync_message(&mut state.0) {
             Ok(Uint8Array::from(message.encode().as_slice()).into())
         } else {
@@ -852,20 +739,15 @@ pub fn init(actor: Option<String>) -> Result<Automerge, JsValue> {
     Automerge::new(actor)
 }
-#[wasm_bindgen(js_name = loadDoc)]
+#[wasm_bindgen(js_name = load)]
 pub fn load(data: Uint8Array, actor: Option<String>) -> Result<Automerge, JsValue> {
     let data = data.to_vec();
-    let observer = None;
-    let options = ApplyOptions::<()>::default();
-    let mut automerge = am::AutoCommit::load_with(&data, options).map_err(to_js_err)?;
+    let mut doc = AutoCommit::load(&data).map_err(to_js_err)?;
     if let Some(s) = actor {
         let actor = automerge::ActorId::from(hex::decode(s).map_err(to_js_err)?.to_vec());
-        automerge.set_actor(actor);
+        doc.set_actor(actor);
     }
-    Ok(Automerge {
-        doc: automerge,
-        observer,
-    })
+    Ok(Automerge { doc })
 }
 #[wasm_bindgen(js_name = encodeChange)]

View file

@@ -0,0 +1,323 @@
#![allow(dead_code)]
use crate::interop::{export_value, js_set};
use automerge::{ObjId, OpObserver, Parents, Prop, Value};
use js_sys::{Array, Object};
use wasm_bindgen::prelude::*;
#[derive(Debug, Clone, Default)]
pub(crate) struct Observer {
enabled: bool,
patches: Vec<Patch>,
}
impl Observer {
pub(crate) fn take_patches(&mut self) -> Vec<Patch> {
std::mem::take(&mut self.patches)
}
pub(crate) fn enable(&mut self, enable: bool) {
if self.enabled && !enable {
self.patches.truncate(0)
}
self.enabled = enable;
}
fn push(&mut self, patch: Patch) {
if let Some(tail) = self.patches.last_mut() {
if let Some(p) = tail.merge(patch) {
self.patches.push(p)
}
} else {
self.patches.push(patch);
}
}
}
#[derive(Debug, Clone)]
pub(crate) enum Patch {
PutMap {
obj: ObjId,
path: Vec<Prop>,
key: String,
value: Value<'static>,
conflict: bool,
},
PutSeq {
obj: ObjId,
path: Vec<Prop>,
index: usize,
value: Value<'static>,
conflict: bool,
},
Insert {
obj: ObjId,
path: Vec<Prop>,
index: usize,
values: Vec<Value<'static>>,
},
Increment {
obj: ObjId,
path: Vec<Prop>,
prop: Prop,
value: i64,
},
DeleteMap {
obj: ObjId,
path: Vec<Prop>,
key: String,
},
DeleteSeq {
obj: ObjId,
path: Vec<Prop>,
index: usize,
length: usize,
},
}
impl OpObserver for Observer {
fn insert(
&mut self,
mut parents: Parents<'_>,
obj: ObjId,
index: usize,
tagged_value: (Value<'_>, ObjId),
) {
if self.enabled {
// probably want to inline the merge/push code here
let path = parents.path().into_iter().map(|p| p.1).collect();
let value = tagged_value.0.to_owned();
let patch = Patch::Insert {
path,
obj,
index,
values: vec![value],
};
self.push(patch);
}
}
fn put(
&mut self,
mut parents: Parents<'_>,
obj: ObjId,
prop: Prop,
tagged_value: (Value<'_>, ObjId),
conflict: bool,
) {
if self.enabled {
let path = parents.path().into_iter().map(|p| p.1).collect();
let value = tagged_value.0.to_owned();
let patch = match prop {
Prop::Map(key) => Patch::PutMap {
path,
obj,
key,
value,
conflict,
},
Prop::Seq(index) => Patch::PutSeq {
path,
obj,
index,
value,
conflict,
},
};
self.patches.push(patch);
}
}
fn increment(
&mut self,
mut parents: Parents<'_>,
obj: ObjId,
prop: Prop,
tagged_value: (i64, ObjId),
) {
if self.enabled {
let path = parents.path().into_iter().map(|p| p.1).collect();
let value = tagged_value.0;
self.patches.push(Patch::Increment {
path,
obj,
prop,
value,
})
}
}
fn delete(&mut self, mut parents: Parents<'_>, obj: ObjId, prop: Prop) {
if self.enabled {
let path = parents.path().into_iter().map(|p| p.1).collect();
let patch = match prop {
Prop::Map(key) => Patch::DeleteMap { path, obj, key },
Prop::Seq(index) => Patch::DeleteSeq {
path,
obj,
index,
length: 1,
},
};
self.patches.push(patch)
}
}
fn merge(&mut self, other: &Self) {
self.patches.extend_from_slice(other.patches.as_slice())
}
fn branch(&self) -> Self {
Observer {
patches: vec![],
enabled: self.enabled,
}
}
}
fn prop_to_js(p: &Prop) -> JsValue {
match p {
Prop::Map(key) => JsValue::from_str(key),
Prop::Seq(index) => JsValue::from_f64(*index as f64),
}
}
fn export_path(path: &[Prop], end: &Prop) -> Array {
let result = Array::new();
for p in path {
result.push(&prop_to_js(p));
}
result.push(&prop_to_js(end));
result
}
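The exported path is simply every parent prop followed by the prop the patch itself acts on. A standalone TypeScript sketch of the same shaping (the names here are illustrative, not the shipped API):

```typescript
// A prop is either a map key or a list index, mirroring the Rust Prop enum.
type Prop = string | number;

// Build the JS-facing path array: all parent props, then the final prop,
// which is what export_path assembles into a js_sys::Array.
function exportPath(parents: Prop[], end: Prop): Prop[] {
  return [...parents, end];
}
```

For example, a put on key `species` of the map at list index 0 under `birds` exports the path `['birds', 0, 'species']`.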
impl Patch {
pub(crate) fn path(&self) -> &[Prop] {
match &self {
Self::PutMap { path, .. } => path.as_slice(),
Self::PutSeq { path, .. } => path.as_slice(),
Self::Increment { path, .. } => path.as_slice(),
Self::Insert { path, .. } => path.as_slice(),
Self::DeleteMap { path, .. } => path.as_slice(),
Self::DeleteSeq { path, .. } => path.as_slice(),
}
}
fn merge(&mut self, other: Patch) -> Option<Patch> {
match (self, &other) {
(
Self::Insert {
obj, index, values, ..
},
Self::Insert {
obj: o2,
values: v2,
index: i2,
..
},
) if obj == o2 && *index + values.len() == *i2 => {
// TODO - there's a way to do this without the clone im sure
values.extend_from_slice(v2.as_slice());
//web_sys::console::log_2(&format!("NEW VAL {}: ", tmpi).into(), &new_value);
None
}
_ => Some(other),
}
}
}
impl TryFrom<Patch> for JsValue {
type Error = JsValue;
fn try_from(p: Patch) -> Result<Self, Self::Error> {
let result = Object::new();
match p {
Patch::PutMap {
path,
key,
value,
conflict,
..
} => {
js_set(&result, "action", "put")?;
js_set(
&result,
"path",
export_path(path.as_slice(), &Prop::Map(key)),
)?;
js_set(&result, "value", export_value(&value))?;
js_set(&result, "conflict", &JsValue::from_bool(conflict))?;
Ok(result.into())
}
Patch::PutSeq {
path,
index,
value,
conflict,
..
} => {
js_set(&result, "action", "put")?;
js_set(
&result,
"path",
export_path(path.as_slice(), &Prop::Seq(index)),
)?;
js_set(&result, "value", export_value(&value))?;
js_set(&result, "conflict", &JsValue::from_bool(conflict))?;
Ok(result.into())
}
Patch::Insert {
path,
index,
values,
..
} => {
js_set(&result, "action", "splice")?;
js_set(
&result,
"path",
export_path(path.as_slice(), &Prop::Seq(index)),
)?;
js_set(
&result,
"values",
values.iter().map(export_value).collect::<Array>(),
)?;
Ok(result.into())
}
Patch::Increment {
path, prop, value, ..
} => {
js_set(&result, "action", "inc")?;
js_set(&result, "path", export_path(path.as_slice(), &prop))?;
js_set(&result, "value", &JsValue::from_f64(value as f64))?;
Ok(result.into())
}
Patch::DeleteMap { path, key, .. } => {
js_set(&result, "action", "del")?;
js_set(
&result,
"path",
export_path(path.as_slice(), &Prop::Map(key)),
)?;
Ok(result.into())
}
Patch::DeleteSeq {
path,
index,
length,
..
} => {
js_set(&result, "action", "del")?;
js_set(
&result,
"path",
export_path(path.as_slice(), &Prop::Seq(index)),
)?;
if length > 1 {
js_set(&result, "length", length)?;
}
Ok(result.into())
}
}
}
}


@@ -0,0 +1,100 @@
import { describe, it } from 'mocha';
//@ts-ignore
import assert from 'assert'
//@ts-ignore
import init, { create, load } from '..'
describe('Automerge', () => {
describe('Patch Apply', () => {
it('apply nested sets on maps', () => {
let start : any = { hello: { mellow: { yellow: "world", x: 1 }, y : 2 } }
let doc1 = create()
doc1.putObject("/", "hello", start.hello);
let mat = doc1.materialize("/")
let doc2 = create()
doc2.enablePatches(true)
doc2.merge(doc1)
let base = doc2.applyPatches({})
assert.deepEqual(mat, start)
assert.deepEqual(base, start)
doc2.delete("/hello/mellow", "yellow");
delete start.hello.mellow.yellow;
base = doc2.applyPatches(base)
mat = doc2.materialize("/")
assert.deepEqual(mat, start)
assert.deepEqual(base, start)
})
it('apply patches on lists', () => {
//let start = { list: [1,2,3,4,5,6] }
let start = { list: [1,2,3,4] }
let doc1 = create()
doc1.putObject("/", "list", start.list);
let mat = doc1.materialize("/")
let doc2 = create()
doc2.enablePatches(true)
doc2.merge(doc1)
mat = doc1.materialize("/")
let base = doc2.applyPatches({})
assert.deepEqual(mat, start)
assert.deepEqual(base, start)
doc2.delete("/list", 3);
start.list.splice(3,1)
base = doc2.applyPatches(base)
assert.deepEqual(base, start)
})
it('apply patches on lists of lists of lists', () => {
let start = { list:
[
[
[ 1, 2, 3, 4, 5, 6],
[ 7, 8, 9,10,11,12],
],
[
[ 7, 8, 9,10,11,12],
[ 1, 2, 3, 4, 5, 6],
]
]
}
let doc1 = create()
doc1.enablePatches(true)
doc1.putObject("/", "list", start.list);
let mat = doc1.materialize("/")
let base = doc1.applyPatches({})
assert.deepEqual(mat, start)
doc1.delete("/list/0/1", 3)
start.list[0][1].splice(3,1)
doc1.delete("/list/0", 0)
start.list[0].splice(0,1)
mat = doc1.materialize("/")
base = doc1.applyPatches(base)
assert.deepEqual(mat, start)
assert.deepEqual(base, start)
})
it('large inserts should make one splice patch', () => {
let doc1 = create()
doc1.enablePatches(true)
doc1.putObject("/", "list", "abc");
let patches = doc1.popPatches()
assert.deepEqual( patches, [
{ action: 'put', conflict: false, path: [ 'list' ], value: [] },
{ action: 'splice', path: [ 'list', 0 ], values: [ 'a', 'b', 'c' ] }])
})
})
})
// FIXME: handle conflicts correctly on apply
// TODO: squash puts
// TODO: merge deletes
// TODO: elide `conflict: false`
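The tests above replay `put`/`splice`/`del` patches against a plain JS object via `applyPatches`. A self-contained sketch of how such patches can be applied to a nested object tree — a hypothetical helper for illustration, not the wasm implementation (it handles only single-element `del`, not the `length` field):

```typescript
type Prop = string | number;

interface Patch {
  action: 'put' | 'splice' | 'del';
  path: Prop[];        // parent path plus the key/index acted on
  value?: unknown;     // for 'put'
  values?: unknown[];  // for 'splice'
}

// Walk to the container named by all but the last path element.
function parentOf(root: any, path: Prop[]): any {
  let node = root;
  for (const p of path.slice(0, -1)) node = node[p];
  return node;
}

// Replay one patch against a plain object/array tree.
function applyPatch(root: any, patch: Patch): void {
  const parent = parentOf(root, patch.path);
  const key = patch.path[patch.path.length - 1];
  switch (patch.action) {
    case 'put':
      parent[key] = patch.value;
      break;
    case 'splice': // key is the insertion index into a list
      parent.splice(key as number, 0, ...(patch.values ?? []));
      break;
    case 'del':
      if (Array.isArray(parent)) parent.splice(key as number, 1);
      else delete parent[key];
      break;
  }
}
```

Applied in order, a put of `title`, a splice of `['b', 'c']` at `['list', 1]` and a del at `['list', 0]` transform `{ list: ['a'] }` into `{ list: ['b', 'c'], title: ... }`.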


@@ -1,7 +1,7 @@
 import { describe, it } from 'mocha';
 import * as assert from 'assert'
 //@ts-ignore
-import { init, create, load } from '..'
+import { create, load } from '..'
 describe('Automerge', () => {
   describe('Readme Examples', () => {
@@ -10,11 +10,9 @@ describe('Automerge', () => {
       doc.free()
     })
     it('Using the Library and Creating a Document (2)', (done) => {
-      init().then((_:any) => {
-        const doc = create()
-        doc.free()
-        done()
-      })
+      const doc = create()
+      doc.free()
+      done()
     })
     it('Automerge Scalar Types (1)', () => {
       const doc = create()


@@ -3,7 +3,7 @@ import { describe, it } from 'mocha';
 import assert from 'assert'
 //@ts-ignore
 import { BloomFilter } from './helpers/sync'
-import { init, create, load, SyncState, Automerge, encodeChange, decodeChange, initSyncState, decodeSyncMessage, decodeSyncState, encodeSyncState, encodeSyncMessage } from '..'
+import { create, load, SyncState, Automerge, encodeChange, decodeChange, initSyncState, decodeSyncMessage, decodeSyncState, encodeSyncState, encodeSyncMessage } from '..'
 import { DecodedSyncMessage, Hash } from '..';
 function sync(a: Automerge, b: Automerge, aSyncState = initSyncState(), bSyncState = initSyncState()) {
@@ -28,9 +28,6 @@ function sync(a: Automerge, b: Automerge, aSyncState = initSyncState(), bSyncSta
 describe('Automerge', () => {
   describe('basics', () => {
-    it('default import init() should return a promise', () => {
-      assert(init() instanceof Promise)
-    })
     it('should create, clone and free', () => {
       const doc1 = create()
@@ -506,7 +503,7 @@ describe('Automerge', () => {
       doc2.enablePatches(true)
       doc2.loadIncremental(doc1.saveIncremental())
       assert.deepEqual(doc2.popPatches(), [
-        { action: 'put', obj: '_root', key: 'hello', value: 'world', datatype: 'str', conflict: false }
+        { action: 'put', path: ['hello'], value: 'world', conflict: false }
       ])
       doc1.free()
       doc2.free()
@@ -518,9 +515,9 @@ describe('Automerge', () => {
       doc2.enablePatches(true)
       doc2.loadIncremental(doc1.saveIncremental())
       assert.deepEqual(doc2.popPatches(), [
-        { action: 'put', obj: '_root', key: 'birds', value: '1@aaaa', datatype: 'map', conflict: false },
-        { action: 'put', obj: '1@aaaa', key: 'friday', value: '2@aaaa', datatype: 'map', conflict: false },
-        { action: 'put', obj: '2@aaaa', key: 'robins', value: 3, datatype: 'int', conflict: false }
+        { action: 'put', path: [ 'birds' ], value: {}, conflict: false },
+        { action: 'put', path: [ 'birds', 'friday' ], value: {}, conflict: false },
+        { action: 'put', path: [ 'birds', 'friday', 'robins' ], value: 3, conflict: false},
       ])
       doc1.free()
       doc2.free()
@@ -534,8 +531,8 @@ describe('Automerge', () => {
       doc1.delete('_root', 'favouriteBird')
       doc2.loadIncremental(doc1.saveIncremental())
       assert.deepEqual(doc2.popPatches(), [
-        { action: 'put', obj: '_root', key: 'favouriteBird', value: 'Robin', datatype: 'str', conflict: false },
-        { action: 'delete', obj: '_root', key: 'favouriteBird' }
+        { action: 'put', path: [ 'favouriteBird' ], value: 'Robin', conflict: false },
+        { action: 'del', path: [ 'favouriteBird' ] }
       ])
       doc1.free()
       doc2.free()
@@ -547,9 +544,8 @@ describe('Automerge', () => {
       doc2.enablePatches(true)
       doc2.loadIncremental(doc1.saveIncremental())
       assert.deepEqual(doc2.popPatches(), [
-        { action: 'put', obj: '_root', key: 'birds', value: '1@aaaa', datatype: 'list', conflict: false },
-        { action: 'insert', obj: '1@aaaa', key: 0, value: 'Goldfinch', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 1, value: 'Chaffinch', datatype: 'str' }
+        { action: 'put', path: [ 'birds' ], value: [], conflict: false },
+        { action: 'splice', path: [ 'birds', 0 ], values: ['Goldfinch', 'Chaffinch'] },
       ])
       doc1.free()
       doc2.free()
@@ -563,9 +559,9 @@ describe('Automerge', () => {
       doc2.enablePatches(true)
       doc2.loadIncremental(doc1.saveIncremental())
       assert.deepEqual(doc2.popPatches(), [
-        { action: 'insert', obj: '1@aaaa', key: 0, value: '2@aaaa', datatype: 'map' },
-        { action: 'put', obj: '2@aaaa', key: 'species', value: 'Goldfinch', datatype: 'str', conflict: false },
-        { action: 'put', obj: '2@aaaa', key: 'count', value: 3, datatype: 'int', conflict: false }
+        { action: 'splice', path: [ 'birds', 0 ], values: [{}] },
+        { action: 'put', path: [ 'birds', 0, 'species' ], value: 'Goldfinch', conflict: false },
+        { action: 'put', path: [ 'birds', 0, 'count', ], value: 3, conflict: false }
       ])
       doc1.free()
       doc2.free()
@@ -582,8 +578,8 @@ describe('Automerge', () => {
       assert.deepEqual(doc1.getWithType('1@aaaa', 0), ['str', 'Chaffinch'])
       assert.deepEqual(doc1.getWithType('1@aaaa', 1), ['str', 'Greenfinch'])
       assert.deepEqual(doc2.popPatches(), [
-        { action: 'delete', obj: '1@aaaa', key: 0 },
-        { action: 'insert', obj: '1@aaaa', key: 1, value: 'Greenfinch', datatype: 'str' }
+        { action: 'del', path: ['birds', 0] },
+        { action: 'splice', path: ['birds', 1], values: ['Greenfinch'] }
       ])
       doc1.free()
       doc2.free()
@@ -608,16 +604,11 @@ describe('Automerge', () => {
       assert.deepEqual([0, 1, 2, 3].map(i => (doc3.getWithType('1@aaaa', i) || [])[1]), ['a', 'b', 'c', 'd'])
       assert.deepEqual([0, 1, 2, 3].map(i => (doc4.getWithType('1@aaaa', i) || [])[1]), ['a', 'b', 'c', 'd'])
       assert.deepEqual(doc3.popPatches(), [
-        { action: 'insert', obj: '1@aaaa', key: 0, value: 'c', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 1, value: 'd', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 0, value: 'a', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 1, value: 'b', datatype: 'str' }
+        { action: 'splice', path: ['values', 0], values:['c','d'] },
+        { action: 'splice', path: ['values', 0], values:['a','b'] },
       ])
       assert.deepEqual(doc4.popPatches(), [
-        { action: 'insert', obj: '1@aaaa', key: 0, value: 'a', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 1, value: 'b', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 2, value: 'c', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 3, value: 'd', datatype: 'str' }
+        { action: 'splice', path: ['values',0], values:['a','b','c','d'] },
       ])
       doc1.free(); doc2.free(); doc3.free(); doc4.free()
     })
@@ -641,16 +632,11 @@ describe('Automerge', () => {
       assert.deepEqual([0, 1, 2, 3, 4, 5].map(i => (doc3.getWithType('1@aaaa', i) || [])[1]), ['a', 'b', 'c', 'd', 'e', 'f'])
       assert.deepEqual([0, 1, 2, 3, 4, 5].map(i => (doc4.getWithType('1@aaaa', i) || [])[1]), ['a', 'b', 'c', 'd', 'e', 'f'])
       assert.deepEqual(doc3.popPatches(), [
-        { action: 'insert', obj: '1@aaaa', key: 2, value: 'e', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 3, value: 'f', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 2, value: 'c', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 3, value: 'd', datatype: 'str' }
+        { action: 'splice', path: ['values', 2], values: ['e','f'] },
+        { action: 'splice', path: ['values', 2], values: ['c','d'] },
       ])
       assert.deepEqual(doc4.popPatches(), [
-        { action: 'insert', obj: '1@aaaa', key: 2, value: 'c', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 3, value: 'd', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 4, value: 'e', datatype: 'str' },
-        { action: 'insert', obj: '1@aaaa', key: 5, value: 'f', datatype: 'str' }
+        { action: 'splice', path: ['values', 2], values: ['c','d','e','f'] },
       ])
       doc1.free(); doc2.free(); doc3.free(); doc4.free()
     })
@@ -669,12 +655,12 @@ describe('Automerge', () => {
       assert.deepEqual(doc4.getWithType('_root', 'bird'), ['str', 'Goldfinch'])
       assert.deepEqual(doc4.getAll('_root', 'bird'), [['str', 'Greenfinch', '1@aaaa'], ['str', 'Goldfinch', '1@bbbb']])
       assert.deepEqual(doc3.popPatches(), [
-        { action: 'put', obj: '_root', key: 'bird', value: 'Greenfinch', datatype: 'str', conflict: false },
-        { action: 'put', obj: '_root', key: 'bird', value: 'Goldfinch', datatype: 'str', conflict: true }
+        { action: 'put', path: ['bird'], value: 'Greenfinch', conflict: false },
+        { action: 'put', path: ['bird'], value: 'Goldfinch', conflict: true },
      ])
       assert.deepEqual(doc4.popPatches(), [
-        { action: 'put', obj: '_root', key: 'bird', value: 'Goldfinch', datatype: 'str', conflict: false },
-        { action: 'put', obj: '_root', key: 'bird', value: 'Goldfinch', datatype: 'str', conflict: true }
+        { action: 'put', path: ['bird'], value: 'Goldfinch', conflict: false },
+        { action: 'put', path: ['bird'], value: 'Goldfinch', conflict: true },
       ])
       doc1.free(); doc2.free(); doc3.free(); doc4.free()
     })
@@ -704,16 +690,16 @@ describe('Automerge', () => {
         ['str', 'Greenfinch', '1@aaaa'], ['str', 'Chaffinch', '1@bbbb'], ['str', 'Goldfinch', '1@cccc']
       ])
       assert.deepEqual(doc1.popPatches(), [
-        { action: 'put', obj: '_root', key: 'bird', value: 'Chaffinch', datatype: 'str', conflict: true },
-        { action: 'put', obj: '_root', key: 'bird', value: 'Goldfinch', datatype: 'str', conflict: true }
+        { action: 'put', path: ['bird'], value: 'Chaffinch', conflict: true },
+        { action: 'put', path: ['bird'], value: 'Goldfinch', conflict: true }
       ])
       assert.deepEqual(doc2.popPatches(), [
-        { action: 'put', obj: '_root', key: 'bird', value: 'Goldfinch', datatype: 'str', conflict: true },
-        { action: 'put', obj: '_root', key: 'bird', value: 'Goldfinch', datatype: 'str', conflict: true }
+        { action: 'put', path: ['bird'], value: 'Goldfinch', conflict: true },
+        { action: 'put', path: ['bird'], value: 'Goldfinch', conflict: true }
       ])
       assert.deepEqual(doc3.popPatches(), [
-        { action: 'put', obj: '_root', key: 'bird', value: 'Goldfinch', datatype: 'str', conflict: true },
-        { action: 'put', obj: '_root', key: 'bird', value: 'Goldfinch', datatype: 'str', conflict: true }
+        { action: 'put', path: ['bird'], value: 'Goldfinch', conflict: true },
+        { action: 'put', path: ['bird'], value: 'Goldfinch', conflict: true }
       ])
       doc1.free(); doc2.free(); doc3.free()
     })
@@ -730,9 +716,9 @@ describe('Automerge', () => {
       doc3.loadIncremental(doc1.saveIncremental())
       assert.deepEqual(doc3.getAll('_root', 'bird'), [['str', 'Goldfinch', '2@aaaa']])
       assert.deepEqual(doc3.popPatches(), [
-        { action: 'put', obj: '_root', key: 'bird', value: 'Greenfinch', datatype: 'str', conflict: false },
-        { action: 'put', obj: '_root', key: 'bird', value: 'Chaffinch', datatype: 'str', conflict: true },
-        { action: 'put', obj: '_root', key: 'bird', value: 'Goldfinch', datatype: 'str', conflict: false }
+        { action: 'put', path: ['bird'], value: 'Greenfinch', conflict: false },
+        { action: 'put', path: ['bird'], value: 'Chaffinch', conflict: true },
+        { action: 'put', path: ['bird'], value: 'Goldfinch', conflict: false }
       ])
       doc1.free(); doc2.free(); doc3.free()
     })
@@ -753,10 +739,10 @@ describe('Automerge', () => {
       assert.deepEqual(doc2.getWithType('_root', 'bird'), ['str', 'Goldfinch'])
       assert.deepEqual(doc2.getAll('_root', 'bird'), [['str', 'Goldfinch', '2@aaaa']])
       assert.deepEqual(doc1.popPatches(), [
-        { action: 'put', obj: '_root', key: 'bird', value: 'Goldfinch', datatype: 'str', conflict: false }
+        { action: 'put', path: ['bird'], value: 'Goldfinch', conflict: false }
       ])
       assert.deepEqual(doc2.popPatches(), [
-        { action: 'put', obj: '_root', key: 'bird', value: 'Goldfinch', datatype: 'str', conflict: false }
+        { action: 'put', path: ['bird'], value: 'Goldfinch', conflict: false }
       ])
       doc1.free(); doc2.free()
     })
@@ -780,12 +766,12 @@ describe('Automerge', () => {
       assert.deepEqual(doc4.getWithType('1@aaaa', 0), ['str', 'Redwing'])
       assert.deepEqual(doc4.getAll('1@aaaa', 0), [['str', 'Song Thrush', '4@aaaa'], ['str', 'Redwing', '4@bbbb']])
       assert.deepEqual(doc3.popPatches(), [
-        { action: 'put', obj: '1@aaaa', key: 0, value: 'Song Thrush', datatype: 'str', conflict: false },
-        { action: 'put', obj: '1@aaaa', key: 0, value: 'Redwing', datatype: 'str', conflict: true }
+        { action: 'put', path: ['birds',0], value: 'Song Thrush', conflict: false },
+        { action: 'put', path: ['birds',0], value: 'Redwing', conflict: true }
       ])
       assert.deepEqual(doc4.popPatches(), [
-        { action: 'put', obj: '1@aaaa', key: 0, value: 'Redwing', datatype: 'str', conflict: false },
-        { action: 'put', obj: '1@aaaa', key: 0, value: 'Redwing', datatype: 'str', conflict: true }
+        { action: 'put', path: ['birds',0], value: 'Redwing', conflict: false },
+        { action: 'put', path: ['birds',0], value: 'Redwing', conflict: true }
       ])
       doc1.free(); doc2.free(); doc3.free(); doc4.free()
     })
@@ -811,16 +797,16 @@ describe('Automerge', () => {
       assert.deepEqual(doc4.getAll('1@aaaa', 0), [['str', 'Ring-necked parakeet', '5@bbbb']])
       assert.deepEqual(doc4.getAll('1@aaaa', 2), [['str', 'Song Thrush', '6@aaaa'], ['str', 'Redwing', '6@bbbb']])
       assert.deepEqual(doc3.popPatches(), [
-        { action: 'delete', obj: '1@aaaa', key: 0 },
-        { action: 'put', obj: '1@aaaa', key: 1, value: 'Song Thrush', datatype: 'str', conflict: false },
-        { action: 'insert', obj: '1@aaaa', key: 0, value: 'Ring-necked parakeet', datatype: 'str' },
-        { action: 'put', obj: '1@aaaa', key: 2, value: 'Redwing', datatype: 'str', conflict: true }
+        { action: 'del', path: ['birds',0], },
+        { action: 'put', path: ['birds',1], value: 'Song Thrush', conflict: false },
+        { action: 'splice', path: ['birds',0], values: ['Ring-necked parakeet'] },
+        { action: 'put', path: ['birds',2], value: 'Redwing', conflict: true }
       ])
       assert.deepEqual(doc4.popPatches(), [
-        { action: 'put', obj: '1@aaaa', key: 0, value: 'Ring-necked parakeet', datatype: 'str', conflict: false },
-        { action: 'put', obj: '1@aaaa', key: 2, value: 'Redwing', datatype: 'str', conflict: false },
-        { action: 'put', obj: '1@aaaa', key: 0, value: 'Ring-necked parakeet', datatype: 'str', conflict: false },
-        { action: 'put', obj: '1@aaaa', key: 2, value: 'Redwing', datatype: 'str', conflict: true }
+        { action: 'put', path: ['birds',0], value: 'Ring-necked parakeet', conflict: false },
+        { action: 'put', path: ['birds',2], value: 'Redwing', conflict: false },
+        { action: 'put', path: ['birds',0], value: 'Ring-necked parakeet', conflict: false },
+        { action: 'put', path: ['birds',2], value: 'Redwing', conflict: true }
       ])
       doc1.free(); doc2.free(); doc3.free(); doc4.free()
     })
@@ -837,14 +823,14 @@ describe('Automerge', () => {
       doc3.loadIncremental(change2)
       assert.deepEqual(doc3.getAll('_root', 'bird'), [['str', 'Robin', '1@aaaa'], ['str', 'Wren', '1@bbbb']])
       assert.deepEqual(doc3.popPatches(), [
-        { action: 'put', obj: '_root', key: 'bird', value: 'Robin', datatype: 'str', conflict: false },
-        { action: 'put', obj: '_root', key: 'bird', value: 'Wren', datatype: 'str', conflict: true }
+        { action: 'put', path: ['bird'], value: 'Robin', conflict: false },
+        { action: 'put', path: ['bird'], value: 'Wren', conflict: true }
       ])
       doc3.loadIncremental(change3)
       assert.deepEqual(doc3.getWithType('_root', 'bird'), ['str', 'Robin'])
       assert.deepEqual(doc3.getAll('_root', 'bird'), [['str', 'Robin', '1@aaaa']])
       assert.deepEqual(doc3.popPatches(), [
-        { action: 'put', obj: '_root', key: 'bird', value: 'Robin', datatype: 'str', conflict: false }
+        { action: 'put', path: ['bird'], value: 'Robin', conflict: false }
       ])
       doc1.free(); doc2.free(); doc3.free()
     })
@@ -860,26 +846,25 @@ describe('Automerge', () => {
       doc2.loadIncremental(change1)
       assert.deepEqual(doc1.getAll('_root', 'birds'), [['list', '1@aaaa'], ['map', '1@bbbb']])
       assert.deepEqual(doc1.popPatches(), [
-        { action: 'put', obj: '_root', key: 'birds', value: '1@bbbb', datatype: 'map', conflict: true },
-        { action: 'put', obj: '1@bbbb', key: 'Sparrowhawk', value: 1, datatype: 'int', conflict: false }
+        { action: 'put', path: ['birds'], value: {}, conflict: true },
+        { action: 'put', path: ['birds', 'Sparrowhawk'], value: 1, conflict: false }
       ])
       assert.deepEqual(doc2.getAll('_root', 'birds'), [['list', '1@aaaa'], ['map', '1@bbbb']])
       assert.deepEqual(doc2.popPatches(), [
-        { action: 'put', obj: '_root', key: 'birds', value: '1@bbbb', datatype: 'map', conflict: true },
-        { action: 'insert', obj: '1@aaaa', key: 0, value: 'Parakeet', datatype: 'str' }
+        { action: 'put', path: ['birds'], value: {}, conflict: true },
+        { action: 'splice', path: ['birds',0], values: ['Parakeet'] }
       ])
       doc1.free(); doc2.free()
     })
     it('should support date objects', () => {
+      // FIXME: either use Date objects or use numbers consistently
       const doc1 = create('aaaa'), doc2 = create('bbbb'), now = new Date()
-      doc1.put('_root', 'createdAt', now.getTime(), 'timestamp')
+      doc1.put('_root', 'createdAt', now)
       doc2.enablePatches(true)
       doc2.loadIncremental(doc1.saveIncremental())
       assert.deepEqual(doc2.getWithType('_root', 'createdAt'), ['timestamp', now])
       assert.deepEqual(doc2.popPatches(), [
-        { action: 'put', obj: '_root', key: 'createdAt', value: now, datatype: 'timestamp', conflict: false }
+        { action: 'put', path: ['createdAt'], value: now, conflict: false }
      ])
       doc1.free(); doc2.free()
     })
@@ -894,11 +879,11 @@ describe('Automerge', () => {
       const list = doc1.putObject('_root', 'list', [])
       assert.deepEqual(doc1.popPatches(), [
-        { action: 'put', obj: '_root', key: 'key1', value: 1, datatype: 'int', conflict: false },
-        { action: 'put', obj: '_root', key: 'key1', value: 2, datatype: 'int', conflict: false },
-        { action: 'put', obj: '_root', key: 'key2', value: 3, datatype: 'int', conflict: false },
-        { action: 'put', obj: '_root', key: 'map', value: map, datatype: 'map', conflict: false },
-        { action: 'put', obj: '_root', key: 'list', value: list, datatype: 'list', conflict: false },
+        { action: 'put', path: ['key1'], value: 1, conflict: false },
+        { action: 'put', path: ['key1'], value: 2, conflict: false },
+        { action: 'put', path: ['key2'], value: 3, conflict: false },
+        { action: 'put', path: ['map'], value: {}, conflict: false },
+        { action: 'put', path: ['list'], value: [], conflict: false },
       ])
       doc1.free()
     })
@ -914,12 +899,12 @@ describe('Automerge', () => {
const list2 = doc1.insertObject(list, 2, []) const list2 = doc1.insertObject(list, 2, [])
assert.deepEqual(doc1.popPatches(), [ assert.deepEqual(doc1.popPatches(), [
{ action: 'put', obj: '_root', key: 'list', value: list, datatype: 'list', conflict: false }, { action: 'put', path: ['list'], value: [], conflict: false },
{ action: 'insert', obj: list, key: 0, value: 1, datatype: 'int' }, { action: 'splice', path: ['list', 0], values: [1] },
{ action: 'insert', obj: list, key: 0, value: 2, datatype: 'int' }, { action: 'splice', path: ['list', 0], values: [2] },
{ action: 'insert', obj: list, key: 2, value: 3, datatype: 'int' }, { action: 'splice', path: ['list', 2], values: [3] },
{ action: 'insert', obj: list, key: 2, value: map, datatype: 'map' }, { action: 'splice', path: ['list', 2], values: [{}] },
{ action: 'insert', obj: list, key: 2, value: list2, datatype: 'list' }, { action: 'splice', path: ['list', 2], values: [[]] },
]) ])
doc1.free() doc1.free()
}) })
@ -933,10 +918,8 @@ describe('Automerge', () => {
const list2 = doc1.pushObject(list, []) const list2 = doc1.pushObject(list, [])
assert.deepEqual(doc1.popPatches(), [ assert.deepEqual(doc1.popPatches(), [
{ action: 'put', obj: '_root', key: 'list', value: list, datatype: 'list', conflict: false }, { action: 'put', path: ['list'], value: [], conflict: false },
{ action: 'insert', obj: list, key: 0, value: 1, datatype: 'int' }, { action: 'splice', path: ['list',0], values: [1,{},[]] },
{ action: 'insert', obj: list, key: 1, value: map, datatype: 'map' },
{ action: 'insert', obj: list, key: 2, value: list2, datatype: 'list' },
]) ])
doc1.free() doc1.free()
}) })
@ -949,13 +932,10 @@ describe('Automerge', () => {
doc1.splice(list, 1, 2) doc1.splice(list, 1, 2)
assert.deepEqual(doc1.popPatches(), [ assert.deepEqual(doc1.popPatches(), [
{ action: 'put', obj: '_root', key: 'list', value: list, datatype: 'list', conflict: false }, { action: 'put', path: ['list'], value: [], conflict: false },
{ action: 'insert', obj: list, key: 0, value: 1, datatype: 'int' }, { action: 'splice', path: ['list',0], values: [1,2,3,4] },
{ action: 'insert', obj: list, key: 1, value: 2, datatype: 'int' }, { action: 'del', path: ['list',1] },
{ action: 'insert', obj: list, key: 2, value: 3, datatype: 'int' }, { action: 'del', path: ['list',1] },
{ action: 'insert', obj: list, key: 3, value: 4, datatype: 'int' },
{ action: 'delete', obj: list, key: 1 },
{ action: 'delete', obj: list, key: 1 },
]) ])
doc1.free() doc1.free()
}) })
@@ -967,8 +947,8 @@ describe('Automerge', () => {
doc1.increment('_root', 'counter', 4) doc1.increment('_root', 'counter', 4)
assert.deepEqual(doc1.popPatches(), [ assert.deepEqual(doc1.popPatches(), [
{ action: 'put', obj: '_root', key: 'counter', value: 2, datatype: 'counter', conflict: false }, { action: 'put', path: ['counter'], value: 2, conflict: false },
{ action: 'increment', obj: '_root', key: 'counter', value: 4 }, { action: 'inc', path: ['counter'], value: 4 },
]) ])
doc1.free() doc1.free()
}) })
@@ -982,10 +962,10 @@ describe('Automerge', () => {
doc1.delete('_root', 'key1') doc1.delete('_root', 'key1')
doc1.delete('_root', 'key2') doc1.delete('_root', 'key2')
assert.deepEqual(doc1.popPatches(), [ assert.deepEqual(doc1.popPatches(), [
{ action: 'put', obj: '_root', key: 'key1', value: 1, datatype: 'int', conflict: false }, { action: 'put', path: ['key1'], value: 1, conflict: false },
{ action: 'put', obj: '_root', key: 'key2', value: 2, datatype: 'int', conflict: false }, { action: 'put', path: ['key2'], value: 2, conflict: false },
{ action: 'delete', obj: '_root', key: 'key1' }, { action: 'del', path: ['key1'], },
{ action: 'delete', obj: '_root', key: 'key2' }, { action: 'del', path: ['key2'], },
]) ])
doc1.free() doc1.free()
}) })
@@ -999,8 +979,8 @@ describe('Automerge', () => {
doc2.loadIncremental(doc1.saveIncremental()) doc2.loadIncremental(doc1.saveIncremental())
assert.deepEqual(doc2.getWithType('_root', 'starlings'), ['counter', 3]) assert.deepEqual(doc2.getWithType('_root', 'starlings'), ['counter', 3])
assert.deepEqual(doc2.popPatches(), [ assert.deepEqual(doc2.popPatches(), [
{ action: 'put', obj: '_root', key: 'starlings', value: 2, datatype: 'counter', conflict: false }, { action: 'put', path: ['starlings'], value: 2, conflict: false },
{ action: 'increment', obj: '_root', key: 'starlings', value: 1 } { action: 'inc', path: ['starlings'], value: 1 }
]) ])
doc1.free(); doc2.free() doc1.free(); doc2.free()
}) })
@@ -1018,10 +998,10 @@ describe('Automerge', () => {
doc2.loadIncremental(doc1.saveIncremental()) doc2.loadIncremental(doc1.saveIncremental())
assert.deepEqual(doc2.popPatches(), [ assert.deepEqual(doc2.popPatches(), [
{ action: 'put', obj: '_root', key: 'list', value: list, datatype: 'list', conflict: false }, { action: 'put', path: ['list'], value: [], conflict: false },
{ action: 'insert', obj: list, key: 0, value: 1, datatype: 'counter' }, { action: 'splice', path: ['list',0], values: [1] },
{ action: 'increment', obj: list, key: 0, value: 2 }, { action: 'inc', path: ['list',0], value: 2 },
{ action: 'increment', obj: list, key: 0, value: -5 }, { action: 'inc', path: ['list',0], value: -5 },
]) ])
doc1.free(); doc2.free() doc1.free(); doc2.free()
}) })
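The test updates above replace the old `obj`/`key` patch objects with path-addressed actions (`put`, `splice`, `inc`, `del`). As a minimal sketch of that shape — using hypothetical simplified Rust types, not the real automerge API — a path-based patch can be modeled like this:

```rust
// Hypothetical, simplified model of the path-based patches the updated JS
// tests assert on. `Prop` and `Patch` here are illustrative stand-ins.
#[derive(Debug, Clone, PartialEq)]
enum Prop {
    Key(String),
    Index(usize),
}

#[derive(Debug, Clone, PartialEq)]
#[allow(dead_code)]
enum Patch {
    Put { path: Vec<Prop>, conflict: bool },
    Splice { path: Vec<Prop>, count: usize },
    Inc { path: Vec<Prop>, value: i64 },
    Del { path: Vec<Prop> },
}

// Render a path like ['list', 0] as "list/0" for display.
fn render_path(path: &[Prop]) -> String {
    path.iter()
        .map(|p| match p {
            Prop::Key(k) => k.clone(),
            Prop::Index(i) => i.to_string(),
        })
        .collect::<Vec<_>>()
        .join("/")
}

fn main() {
    // Mirrors `{ action: 'splice', path: ['list', 0], values: [1] }`.
    let patch = Patch::Splice {
        path: vec![Prop::Key("list".into()), Prop::Index(0)],
        count: 1,
    };
    if let Patch::Splice { path, .. } = &patch {
        println!("splice at {}", render_path(path)); // prints "splice at list/0"
    }
}
```

The key design change is that a patch now locates its target by a full path from the document root rather than by an object id plus key, which is what lets the tests drop the `obj`/`key` fields.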


@@ -6,7 +6,7 @@
"description": "typescript types for low level automerge api", "description": "typescript types for low level automerge api",
"homepage": "https://github.com/automerge/automerge-rs/tree/main/automerge-wasm", "homepage": "https://github.com/automerge/automerge-rs/tree/main/automerge-wasm",
"repository": "github:automerge/automerge-rs", "repository": "github:automerge/automerge-rs",
"version": "0.1.5", "version": "0.1.6",
"license": "MIT", "license": "MIT",
"files": [ "files": [
"LICENSE", "LICENSE",


@@ -1,49 +0,0 @@
export {
loadDoc as load,
create,
encodeChange,
decodeChange,
initSyncState,
encodeSyncMessage,
decodeSyncMessage,
encodeSyncState,
decodeSyncState,
exportSyncState,
importSyncState,
} from "./bindgen.js"
import {
loadDoc as load,
create,
encodeChange,
decodeChange,
initSyncState,
encodeSyncMessage,
decodeSyncMessage,
encodeSyncState,
decodeSyncState,
exportSyncState,
importSyncState,
} from "./bindgen.js"
let api = {
load,
create,
encodeChange,
decodeChange,
initSyncState,
encodeSyncMessage,
decodeSyncMessage,
encodeSyncState,
decodeSyncState,
exportSyncState,
importSyncState
}
import wasm_init from "./bindgen.js"
export function init() {
return new Promise((resolve,reject) => wasm_init().then(() => {
resolve({ ... api, load, create })
}))
}


@@ -9,19 +9,19 @@ use automerge::ROOT;
fn main() { fn main() {
let mut doc = Automerge::new(); let mut doc = Automerge::new();
let mut observer = VecOpObserver::default();
// a simple scalar change in the root object // a simple scalar change in the root object
doc.transact_with::<_, _, AutomergeError, _, _>( let mut result = doc
|_result| CommitOptions::default().with_op_observer(&mut observer), .transact_with::<_, _, AutomergeError, _, VecOpObserver>(
|tx| { |_result| CommitOptions::default(),
tx.put(ROOT, "hello", "world").unwrap(); |tx| {
Ok(()) tx.put(ROOT, "hello", "world").unwrap();
}, Ok(())
) },
.unwrap(); )
get_changes(&doc, observer.take_patches()); .unwrap();
get_changes(&doc, result.op_observer.take_patches());
let mut tx = doc.transaction(); let mut tx = doc.transaction_with_observer(VecOpObserver::default());
let map = tx let map = tx
.put_object(ROOT, "my new map", automerge::ObjType::Map) .put_object(ROOT, "my new map", automerge::ObjType::Map)
.unwrap(); .unwrap();
@@ -36,28 +36,28 @@ fn main() {
tx.insert(&list, 1, "woo").unwrap(); tx.insert(&list, 1, "woo").unwrap();
let m = tx.insert_object(&list, 2, automerge::ObjType::Map).unwrap(); let m = tx.insert_object(&list, 2, automerge::ObjType::Map).unwrap();
tx.put(&m, "hi", 2).unwrap(); tx.put(&m, "hi", 2).unwrap();
let _heads3 = tx.commit_with(CommitOptions::default().with_op_observer(&mut observer)); let patches = tx.op_observer.take_patches();
get_changes(&doc, observer.take_patches()); let _heads3 = tx.commit_with(CommitOptions::default());
get_changes(&doc, patches);
} }
fn get_changes(doc: &Automerge, patches: Vec<Patch>) { fn get_changes(doc: &Automerge, patches: Vec<Patch>) {
for patch in patches { for patch in patches {
match patch { match patch {
Patch::Put { Patch::Put {
obj, obj, prop, value, ..
key,
value,
conflict: _,
} => { } => {
println!( println!(
"put {:?} at {:?} in obj {:?}, object path {:?}", "put {:?} at {:?} in obj {:?}, object path {:?}",
value, value,
key, prop,
obj, obj,
doc.path_to_object(&obj) doc.path_to_object(&obj)
) )
} }
Patch::Insert { obj, index, value } => { Patch::Insert {
obj, index, value, ..
} => {
println!( println!(
"insert {:?} at {:?} in obj {:?}, object path {:?}", "insert {:?} at {:?} in obj {:?}, object path {:?}",
value, value,
@@ -66,18 +66,20 @@ fn get_changes(doc: &Automerge, patches: Vec<Patch>) {
doc.path_to_object(&obj) doc.path_to_object(&obj)
) )
} }
Patch::Increment { obj, key, value } => { Patch::Increment {
obj, prop, value, ..
} => {
println!( println!(
"increment {:?} in obj {:?} by {:?}, object path {:?}", "increment {:?} in obj {:?} by {:?}, object path {:?}",
key, prop,
obj, obj,
value, value,
doc.path_to_object(&obj) doc.path_to_object(&obj)
) )
} }
Patch::Delete { obj, key } => println!( Patch::Delete { obj, prop, .. } => println!(
"delete {:?} in obj {:?}, object path {:?}", "delete {:?} in obj {:?}, object path {:?}",
key, prop,
obj, obj,
doc.path_to_object(&obj) doc.path_to_object(&obj)
), ),
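The example diff above moves patch collection from an external observer passed through commit options into the transaction itself (`tx.op_observer.take_patches()`). A self-contained sketch of that ownership pattern — with hypothetical `MiniTx`/`VecObserver` stand-ins, not the real automerge types — looks like this:

```rust
// Sketch of the "observer lives inside the transaction" pattern: the
// transaction owns its observer, and the caller drains patches from it
// before committing. All names here are illustrative stand-ins.
#[derive(Default)]
struct VecObserver {
    patches: Vec<String>,
}

impl VecObserver {
    fn record(&mut self, p: &str) {
        self.patches.push(p.to_string());
    }
    // Drain accumulated patches, leaving the observer empty for reuse.
    fn take_patches(&mut self) -> Vec<String> {
        std::mem::take(&mut self.patches)
    }
}

struct MiniTx {
    op_observer: VecObserver,
}

impl MiniTx {
    fn put(&mut self, key: &str, value: &str) {
        // A real transaction would also mutate the document; here we only
        // demonstrate how mutations feed the owned observer.
        self.op_observer.record(&format!("put {key}={value}"));
    }
}

fn main() {
    let mut tx = MiniTx { op_observer: VecObserver::default() };
    tx.put("hello", "world");
    // Take the patches from the transaction-owned observer, as the
    // updated example does before `commit_with`.
    let patches = tx.op_observer.take_patches();
    println!("{}", patches.len()); // prints "1"
}
```

Because the observer is a field of the transaction rather than a `&mut` borrow threaded through `CommitOptions`, the lifetime parameter on the options type disappears, which is what the surrounding hunks simplify.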


@@ -4,8 +4,7 @@ use crate::exid::ExId;
use crate::op_observer::OpObserver; use crate::op_observer::OpObserver;
use crate::transaction::{CommitOptions, Transactable}; use crate::transaction::{CommitOptions, Transactable};
use crate::{ use crate::{
sync, ApplyOptions, Keys, KeysAt, ListRange, ListRangeAt, MapRange, MapRangeAt, ObjType, sync, Keys, KeysAt, ListRange, ListRangeAt, MapRange, MapRangeAt, ObjType, Parents, ScalarValue,
Parents, ScalarValue,
}; };
use crate::{ use crate::{
transaction::TransactionInner, ActorId, Automerge, AutomergeError, Change, ChangeHash, Prop, transaction::TransactionInner, ActorId, Automerge, AutomergeError, Change, ChangeHash, Prop,
@@ -14,22 +13,46 @@ use crate::{
/// An automerge document that automatically manages transactions. /// An automerge document that automatically manages transactions.
#[derive(Debug, Clone)] #[derive(Debug, Clone)]
pub struct AutoCommit { pub struct AutoCommitWithObs<Obs: OpObserver> {
doc: Automerge, doc: Automerge,
transaction: Option<TransactionInner>, transaction: Option<(Obs, TransactionInner)>,
op_observer: Obs,
} }
impl Default for AutoCommit { pub type AutoCommit = AutoCommitWithObs<()>;
impl<O: OpObserver> Default for AutoCommitWithObs<O> {
fn default() -> Self { fn default() -> Self {
Self::new() let op_observer = O::default();
AutoCommitWithObs {
doc: Automerge::new(),
transaction: None,
op_observer,
}
} }
} }
impl AutoCommit { impl AutoCommit {
pub fn new() -> Self { pub fn new() -> AutoCommit {
Self { AutoCommitWithObs {
doc: Automerge::new(), doc: Automerge::new(),
transaction: None, transaction: None,
op_observer: (),
}
}
}
impl<Obs: OpObserver> AutoCommitWithObs<Obs> {
pub fn observer(&mut self) -> &mut Obs {
self.ensure_transaction_closed();
&mut self.op_observer
}
pub fn with_observer<Obs2: OpObserver>(self, op_observer: Obs2) -> AutoCommitWithObs<Obs2> {
AutoCommitWithObs {
doc: self.doc,
transaction: self.transaction.map(|(_, t)| (op_observer.branch(), t)),
op_observer,
} }
} }
@@ -58,7 +81,7 @@ impl AutoCommit {
fn ensure_transaction_open(&mut self) { fn ensure_transaction_open(&mut self) {
if self.transaction.is_none() { if self.transaction.is_none() {
self.transaction = Some(self.doc.transaction_inner()); self.transaction = Some((self.op_observer.branch(), self.doc.transaction_inner()));
} }
} }
@@ -67,6 +90,7 @@ impl AutoCommit {
Self { Self {
doc: self.doc.fork(), doc: self.doc.fork(),
transaction: self.transaction.clone(), transaction: self.transaction.clone(),
op_observer: self.op_observer.clone(),
} }
} }
@@ -75,46 +99,35 @@ impl AutoCommit {
Ok(Self { Ok(Self {
doc: self.doc.fork_at(heads)?, doc: self.doc.fork_at(heads)?,
transaction: self.transaction.clone(), transaction: self.transaction.clone(),
op_observer: self.op_observer.clone(),
}) })
} }
fn ensure_transaction_closed(&mut self) { fn ensure_transaction_closed(&mut self) {
if let Some(tx) = self.transaction.take() { if let Some((current, tx)) = self.transaction.take() {
tx.commit::<()>(&mut self.doc, None, None, None); self.op_observer.merge(&current);
tx.commit(&mut self.doc, None, None);
} }
} }
pub fn load(data: &[u8]) -> Result<Self, AutomergeError> { pub fn load(data: &[u8]) -> Result<Self, AutomergeError> {
// passing a () observer here has performance implications on all loads
// if we want an autocommit::load() method that can be observed we need to make a new method
// fn observed_load() ?
let doc = Automerge::load(data)?; let doc = Automerge::load(data)?;
let op_observer = Obs::default();
Ok(Self { Ok(Self {
doc, doc,
transaction: None, transaction: None,
}) op_observer,
}
pub fn load_with<Obs: OpObserver>(
data: &[u8],
options: ApplyOptions<'_, Obs>,
) -> Result<Self, AutomergeError> {
let doc = Automerge::load_with(data, options)?;
Ok(Self {
doc,
transaction: None,
}) })
} }
pub fn load_incremental(&mut self, data: &[u8]) -> Result<usize, AutomergeError> { pub fn load_incremental(&mut self, data: &[u8]) -> Result<usize, AutomergeError> {
self.ensure_transaction_closed(); self.ensure_transaction_closed();
self.doc.load_incremental(data) // TODO - would be nice to pass None here instead of &mut ()
} self.doc
.load_incremental_with(data, Some(&mut self.op_observer))
pub fn load_incremental_with<'a, Obs: OpObserver>(
&mut self,
data: &[u8],
options: ApplyOptions<'a, Obs>,
) -> Result<usize, AutomergeError> {
self.ensure_transaction_closed();
self.doc.load_incremental_with(data, options)
} }
pub fn apply_changes( pub fn apply_changes(
@@ -122,34 +135,19 @@ impl AutoCommit {
changes: impl IntoIterator<Item = Change>, changes: impl IntoIterator<Item = Change>,
) -> Result<(), AutomergeError> { ) -> Result<(), AutomergeError> {
self.ensure_transaction_closed(); self.ensure_transaction_closed();
self.doc.apply_changes(changes) self.doc
} .apply_changes_with(changes, Some(&mut self.op_observer))
pub fn apply_changes_with<I: IntoIterator<Item = Change>, Obs: OpObserver>(
&mut self,
changes: I,
options: ApplyOptions<'_, Obs>,
) -> Result<(), AutomergeError> {
self.ensure_transaction_closed();
self.doc.apply_changes_with(changes, options)
} }
/// Takes all the changes in `other` which are not in `self` and applies them /// Takes all the changes in `other` which are not in `self` and applies them
pub fn merge(&mut self, other: &mut Self) -> Result<Vec<ChangeHash>, AutomergeError> { pub fn merge<Obs2: OpObserver>(
self.ensure_transaction_closed();
other.ensure_transaction_closed();
self.doc.merge(&mut other.doc)
}
/// Takes all the changes in `other` which are not in `self` and applies them
pub fn merge_with<'a, Obs: OpObserver>(
&mut self, &mut self,
other: &mut Self, other: &mut AutoCommitWithObs<Obs2>,
options: ApplyOptions<'a, Obs>,
) -> Result<Vec<ChangeHash>, AutomergeError> { ) -> Result<Vec<ChangeHash>, AutomergeError> {
self.ensure_transaction_closed(); self.ensure_transaction_closed();
other.ensure_transaction_closed(); other.ensure_transaction_closed();
self.doc.merge_with(&mut other.doc, options) self.doc
.merge_with(&mut other.doc, Some(&mut self.op_observer))
} }
pub fn save(&mut self) -> Vec<u8> { pub fn save(&mut self) -> Vec<u8> {
@@ -220,17 +218,6 @@ impl AutoCommit {
self.doc.receive_sync_message(sync_state, message) self.doc.receive_sync_message(sync_state, message)
} }
pub fn receive_sync_message_with<'a, Obs: OpObserver>(
&mut self,
sync_state: &mut sync::State,
message: sync::Message,
options: ApplyOptions<'a, Obs>,
) -> Result<(), AutomergeError> {
self.ensure_transaction_closed();
self.doc
.receive_sync_message_with(sync_state, message, options)
}
/// Return a graphviz representation of the opset. /// Return a graphviz representation of the opset.
/// ///
/// # Arguments /// # Arguments
@@ -251,7 +238,7 @@ impl AutoCommit {
} }
pub fn commit(&mut self) -> ChangeHash { pub fn commit(&mut self) -> ChangeHash {
self.commit_with::<()>(CommitOptions::default()) self.commit_with(CommitOptions::default())
} }
/// Commit the current operations with some options. /// Commit the current operations with some options.
@@ -267,33 +254,29 @@ impl AutoCommit {
/// doc.put_object(&ROOT, "todos", ObjType::List).unwrap(); /// doc.put_object(&ROOT, "todos", ObjType::List).unwrap();
/// let now = SystemTime::now().duration_since(SystemTime::UNIX_EPOCH).unwrap().as_secs() as /// let now = SystemTime::now().duration_since(SystemTime::UNIX_EPOCH).unwrap().as_secs() as
/// i64; /// i64;
/// doc.commit_with::<()>(CommitOptions::default().with_message("Create todos list").with_time(now)); /// doc.commit_with(CommitOptions::default().with_message("Create todos list").with_time(now));
/// ``` /// ```
pub fn commit_with<Obs: OpObserver>(&mut self, options: CommitOptions<'_, Obs>) -> ChangeHash { pub fn commit_with(&mut self, options: CommitOptions) -> ChangeHash {
// ensure that even no changes triggers a change // ensure that even no changes triggers a change
self.ensure_transaction_open(); self.ensure_transaction_open();
let tx = self.transaction.take().unwrap(); let (current, tx) = self.transaction.take().unwrap();
tx.commit( self.op_observer.merge(&current);
&mut self.doc, tx.commit(&mut self.doc, options.message, options.time)
options.message,
options.time,
options.op_observer,
)
} }
pub fn rollback(&mut self) -> usize { pub fn rollback(&mut self) -> usize {
self.transaction self.transaction
.take() .take()
.map(|tx| tx.rollback(&mut self.doc)) .map(|(_, tx)| tx.rollback(&mut self.doc))
.unwrap_or(0) .unwrap_or(0)
} }
} }
impl Transactable for AutoCommit { impl<Obs: OpObserver> Transactable for AutoCommitWithObs<Obs> {
fn pending_ops(&self) -> usize { fn pending_ops(&self) -> usize {
self.transaction self.transaction
.as_ref() .as_ref()
.map(|t| t.pending_ops()) .map(|(_, t)| t.pending_ops())
.unwrap_or(0) .unwrap_or(0)
} }
@@ -389,8 +372,8 @@ impl Transactable for AutoCommit {
value: V, value: V,
) -> Result<(), AutomergeError> { ) -> Result<(), AutomergeError> {
self.ensure_transaction_open(); self.ensure_transaction_open();
let tx = self.transaction.as_mut().unwrap(); let (current, tx) = self.transaction.as_mut().unwrap();
tx.put(&mut self.doc, obj.as_ref(), prop, value) tx.put(&mut self.doc, current, obj.as_ref(), prop, value)
} }
fn put_object<O: AsRef<ExId>, P: Into<Prop>>( fn put_object<O: AsRef<ExId>, P: Into<Prop>>(
@@ -400,8 +383,8 @@ impl Transactable for AutoCommit {
value: ObjType, value: ObjType,
) -> Result<ExId, AutomergeError> { ) -> Result<ExId, AutomergeError> {
self.ensure_transaction_open(); self.ensure_transaction_open();
let tx = self.transaction.as_mut().unwrap(); let (current, tx) = self.transaction.as_mut().unwrap();
tx.put_object(&mut self.doc, obj.as_ref(), prop, value) tx.put_object(&mut self.doc, current, obj.as_ref(), prop, value)
} }
fn insert<O: AsRef<ExId>, V: Into<ScalarValue>>( fn insert<O: AsRef<ExId>, V: Into<ScalarValue>>(
@@ -411,8 +394,8 @@ impl Transactable for AutoCommit {
value: V, value: V,
) -> Result<(), AutomergeError> { ) -> Result<(), AutomergeError> {
self.ensure_transaction_open(); self.ensure_transaction_open();
let tx = self.transaction.as_mut().unwrap(); let (current, tx) = self.transaction.as_mut().unwrap();
tx.insert(&mut self.doc, obj.as_ref(), index, value) tx.insert(&mut self.doc, current, obj.as_ref(), index, value)
} }
fn insert_object<O: AsRef<ExId>>( fn insert_object<O: AsRef<ExId>>(
@@ -422,8 +405,8 @@ impl Transactable for AutoCommit {
value: ObjType, value: ObjType,
) -> Result<ExId, AutomergeError> { ) -> Result<ExId, AutomergeError> {
self.ensure_transaction_open(); self.ensure_transaction_open();
let tx = self.transaction.as_mut().unwrap(); let (current, tx) = self.transaction.as_mut().unwrap();
tx.insert_object(&mut self.doc, obj.as_ref(), index, value) tx.insert_object(&mut self.doc, current, obj.as_ref(), index, value)
} }
fn increment<O: AsRef<ExId>, P: Into<Prop>>( fn increment<O: AsRef<ExId>, P: Into<Prop>>(
@@ -433,8 +416,8 @@ impl Transactable for AutoCommit {
value: i64, value: i64,
) -> Result<(), AutomergeError> { ) -> Result<(), AutomergeError> {
self.ensure_transaction_open(); self.ensure_transaction_open();
let tx = self.transaction.as_mut().unwrap(); let (current, tx) = self.transaction.as_mut().unwrap();
tx.increment(&mut self.doc, obj.as_ref(), prop, value) tx.increment(&mut self.doc, current, obj.as_ref(), prop, value)
} }
fn delete<O: AsRef<ExId>, P: Into<Prop>>( fn delete<O: AsRef<ExId>, P: Into<Prop>>(
@@ -443,8 +426,8 @@ impl Transactable for AutoCommit {
prop: P, prop: P,
) -> Result<(), AutomergeError> { ) -> Result<(), AutomergeError> {
self.ensure_transaction_open(); self.ensure_transaction_open();
let tx = self.transaction.as_mut().unwrap(); let (current, tx) = self.transaction.as_mut().unwrap();
tx.delete(&mut self.doc, obj.as_ref(), prop) tx.delete(&mut self.doc, current, obj.as_ref(), prop)
} }
/// Splice new elements into the given sequence. Returns a vector of the OpIds used to insert /// Splice new elements into the given sequence. Returns a vector of the OpIds used to insert
@@ -457,8 +440,8 @@ impl Transactable for AutoCommit {
vals: V, vals: V,
) -> Result<(), AutomergeError> { ) -> Result<(), AutomergeError> {
self.ensure_transaction_open(); self.ensure_transaction_open();
let tx = self.transaction.as_mut().unwrap(); let (current, tx) = self.transaction.as_mut().unwrap();
tx.splice(&mut self.doc, obj.as_ref(), pos, del, vals) tx.splice(&mut self.doc, current, obj.as_ref(), pos, del, vals)
} }
fn text<O: AsRef<ExId>>(&self, obj: O) -> Result<String, AutomergeError> { fn text<O: AsRef<ExId>>(&self, obj: O) -> Result<String, AutomergeError> {
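The file above generalizes `AutoCommit` into `AutoCommitWithObs<Obs>`, with `()` standing in for "no observer" and `with_observer` changing the observer type of an existing document. A self-contained sketch of that generic pattern — simplified stand-in types, not the real `AutoCommitWithObs`, which also carries the document and an open transaction — is:

```rust
// Sketch of the generic-observer pattern: the document type carries its
// observer as a type parameter, `()` means "unobserved", and
// `with_observer` swaps in a different observer type. Illustrative only.
trait Observer: Default {}
impl Observer for () {}

#[derive(Default)]
struct VecObserver(Vec<String>);
impl Observer for VecObserver {}

struct AutoCommitWithObs<Obs: Observer> {
    op_observer: Obs,
}

// The plain, unobserved document is just the `()` instantiation.
type AutoCommit = AutoCommitWithObs<()>;

impl AutoCommit {
    fn new() -> AutoCommit {
        AutoCommitWithObs { op_observer: () }
    }
}

impl<Obs: Observer> AutoCommitWithObs<Obs> {
    // Consumes self and returns a document observed by `op_observer`.
    fn with_observer<Obs2: Observer>(self, op_observer: Obs2) -> AutoCommitWithObs<Obs2> {
        AutoCommitWithObs { op_observer }
    }
}

fn main() {
    // Mirrors `AutoCommit::new().with_observer(VecOpObserver::default())`
    // from the updated tests further down.
    let doc = AutoCommit::new().with_observer(VecObserver::default());
    println!("{}", doc.op_observer.0.len()); // prints "0"
}
```

Encoding the observer in the type means unobserved documents pay no cost for observation, while observed ones can expose their patches directly (as `observer()` does in the diff) instead of requiring a separate `*_with` method on every call site.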


@@ -19,8 +19,8 @@ use crate::types::{
ScalarValue, Value, ScalarValue, Value,
}; };
use crate::{ use crate::{
query, ApplyOptions, AutomergeError, Change, KeysAt, ListRange, ListRangeAt, MapRange, query, AutomergeError, Change, KeysAt, ListRange, ListRangeAt, MapRange, MapRangeAt, ObjType,
MapRangeAt, ObjType, Prop, Values, Prop, Values,
}; };
use serde::Serialize; use serde::Serialize;
@@ -111,10 +111,22 @@ impl Automerge {
} }
/// Start a transaction. /// Start a transaction.
pub fn transaction(&mut self) -> Transaction<'_> { pub fn transaction(&mut self) -> Transaction<'_, ()> {
Transaction { Transaction {
inner: Some(self.transaction_inner()), inner: Some(self.transaction_inner()),
doc: self, doc: self,
op_observer: (),
}
}
pub fn transaction_with_observer<Obs: OpObserver>(
&mut self,
op_observer: Obs,
) -> Transaction<'_, Obs> {
Transaction {
inner: Some(self.transaction_inner()),
doc: self,
op_observer,
} }
} }
@@ -143,15 +155,16 @@ impl Automerge {
/// Run a transaction on this document in a closure, automatically handling commit or rollback /// Run a transaction on this document in a closure, automatically handling commit or rollback
/// afterwards. /// afterwards.
pub fn transact<F, O, E>(&mut self, f: F) -> transaction::Result<O, E> pub fn transact<F, O, E>(&mut self, f: F) -> transaction::Result<O, (), E>
where where
F: FnOnce(&mut Transaction<'_>) -> Result<O, E>, F: FnOnce(&mut Transaction<'_, ()>) -> Result<O, E>,
{ {
let mut tx = self.transaction(); let mut tx = self.transaction();
let result = f(&mut tx); let result = f(&mut tx);
match result { match result {
Ok(result) => Ok(Success { Ok(result) => Ok(Success {
result, result,
op_observer: (),
hash: tx.commit(), hash: tx.commit(),
}), }),
Err(error) => Err(Failure { Err(error) => Err(Failure {
@@ -162,19 +175,25 @@
} }
/// Like [`Self::transact`] but with a function for generating the commit options. /// Like [`Self::transact`] but with a function for generating the commit options.
pub fn transact_with<'a, F, O, E, C, Obs>(&mut self, c: C, f: F) -> transaction::Result<O, E> pub fn transact_with<F, O, E, C, Obs>(&mut self, c: C, f: F) -> transaction::Result<O, Obs, E>
where where
F: FnOnce(&mut Transaction<'_>) -> Result<O, E>, F: FnOnce(&mut Transaction<'_, Obs>) -> Result<O, E>,
C: FnOnce(&O) -> CommitOptions<'a, Obs>, C: FnOnce(&O) -> CommitOptions,
Obs: 'a + OpObserver, Obs: OpObserver,
{ {
let mut tx = self.transaction(); let mut op_observer = Obs::default();
let mut tx = self.transaction_with_observer(Default::default());
let result = f(&mut tx); let result = f(&mut tx);
match result { match result {
Ok(result) => { Ok(result) => {
let commit_options = c(&result); let commit_options = c(&result);
std::mem::swap(&mut op_observer, &mut tx.op_observer);
let hash = tx.commit_with(commit_options); let hash = tx.commit_with(commit_options);
Ok(Success { result, hash }) Ok(Success {
result,
hash,
op_observer,
})
} }
Err(error) => Err(Failure { Err(error) => Err(Failure {
error, error,
@@ -220,17 +239,6 @@ impl Automerge {
// PropAt::() // PropAt::()
// NthAt::() // NthAt::()
/// Get the object id of the object that contains this object and the prop that this object is
/// at in that object.
pub(crate) fn parent_object(&self, obj: ObjId) -> Option<(ObjId, Key)> {
if obj == ObjId::root() {
// root has no parent
None
} else {
self.ops.parent_object(&obj)
}
}
/// Get the parents of an object in the document tree. /// Get the parents of an object in the document tree.
/// ///
/// ### Errors /// ### Errors
@@ -244,10 +252,7 @@
/// value. /// value.
pub fn parents<O: AsRef<ExId>>(&self, obj: O) -> Result<Parents<'_>, AutomergeError> { pub fn parents<O: AsRef<ExId>>(&self, obj: O) -> Result<Parents<'_>, AutomergeError> {
let obj_id = self.exid_to_obj(obj.as_ref())?; let obj_id = self.exid_to_obj(obj.as_ref())?;
Ok(Parents { Ok(self.ops.parents(obj_id))
obj: obj_id,
doc: self,
})
} }
pub fn path_to_object<O: AsRef<ExId>>( pub fn path_to_object<O: AsRef<ExId>>(
@@ -259,21 +264,6 @@
Ok(path) Ok(path)
} }
/// Export a key to a prop.
pub(crate) fn export_key(&self, obj: ObjId, key: Key) -> Prop {
match key {
Key::Map(m) => Prop::Map(self.ops.m.props.get(m).into()),
Key::Seq(opid) => {
let i = self
.ops
.search(&obj, query::ElemIdPos::new(opid))
.index()
.unwrap();
Prop::Seq(i)
}
}
}
/// Get the keys of the object `obj`. /// Get the keys of the object `obj`.
/// ///
/// For a map this returns the keys of the map. /// For a map this returns the keys of the map.
@@ -587,14 +577,14 @@ impl Automerge {
/// Load a document. /// Load a document.
pub fn load(data: &[u8]) -> Result<Self, AutomergeError> { pub fn load(data: &[u8]) -> Result<Self, AutomergeError> {
Self::load_with::<()>(data, ApplyOptions::default()) Self::load_with::<()>(data, None)
} }
/// Load a document. /// Load a document.
#[tracing::instrument(skip(data, options), err)] #[tracing::instrument(skip(data, observer), err)]
pub fn load_with<Obs: OpObserver>( pub fn load_with<Obs: OpObserver>(
data: &[u8], data: &[u8],
mut options: ApplyOptions<'_, Obs>, mut observer: Option<&mut Obs>,
) -> Result<Self, AutomergeError> { ) -> Result<Self, AutomergeError> {
if data.is_empty() { if data.is_empty() {
tracing::trace!("no data, initializing empty document"); tracing::trace!("no data, initializing empty document");
@@ -606,7 +596,6 @@ impl Automerge {
if !first_chunk.checksum_valid() { if !first_chunk.checksum_valid() {
return Err(load::Error::BadChecksum.into()); return Err(load::Error::BadChecksum.into());
} }
let observer = &mut options.op_observer;
let mut am = match first_chunk { let mut am = match first_chunk {
storage::Chunk::Document(d) => { storage::Chunk::Document(d) => {
@@ -616,7 +605,7 @@ impl Automerge {
result: op_set, result: op_set,
changes, changes,
heads, heads,
} = match observer { } = match &mut observer {
Some(o) => storage::load::reconstruct_document(&d, OpSet::observed_builder(*o)), Some(o) => storage::load::reconstruct_document(&d, OpSet::observed_builder(*o)),
None => storage::load::reconstruct_document(&d, OpSet::builder()), None => storage::load::reconstruct_document(&d, OpSet::builder()),
} }
@@ -651,7 +640,7 @@ impl Automerge {
let change = Change::new_from_unverified(stored_change.into_owned(), None) let change = Change::new_from_unverified(stored_change.into_owned(), None)
.map_err(|e| load::Error::InvalidChangeColumns(Box::new(e)))?; .map_err(|e| load::Error::InvalidChangeColumns(Box::new(e)))?;
let mut am = Self::new(); let mut am = Self::new();
am.apply_change(change, observer); am.apply_change(change, &mut observer);
am am
} }
storage::Chunk::CompressedChange(stored_change, compressed) => { storage::Chunk::CompressedChange(stored_change, compressed) => {
@@ -662,7 +651,7 @@ impl Automerge {
) )
.map_err(|e| load::Error::InvalidChangeColumns(Box::new(e)))?; .map_err(|e| load::Error::InvalidChangeColumns(Box::new(e)))?;
let mut am = Self::new(); let mut am = Self::new();
am.apply_change(change, observer); am.apply_change(change, &mut observer);
am am
} }
}; };
@@ -670,7 +659,7 @@ impl Automerge {
match load::load_changes(remaining.reset()) { match load::load_changes(remaining.reset()) {
load::LoadedChanges::Complete(c) => { load::LoadedChanges::Complete(c) => {
for change in c { for change in c {
am.apply_change(change, observer); am.apply_change(change, &mut observer);
} }
} }
load::LoadedChanges::Partial { error, .. } => return Err(error.into()), load::LoadedChanges::Partial { error, .. } => return Err(error.into()),
@@ -680,14 +669,14 @@ impl Automerge {
/// Load an incremental save of a document. /// Load an incremental save of a document.
pub fn load_incremental(&mut self, data: &[u8]) -> Result<usize, AutomergeError> { pub fn load_incremental(&mut self, data: &[u8]) -> Result<usize, AutomergeError> {
self.load_incremental_with::<()>(data, ApplyOptions::default()) self.load_incremental_with::<()>(data, None)
} }
/// Load an incremental save of a document. /// Load an incremental save of a document.
pub fn load_incremental_with<Obs: OpObserver>( pub fn load_incremental_with<Obs: OpObserver>(
&mut self, &mut self,
data: &[u8], data: &[u8],
options: ApplyOptions<'_, Obs>, op_observer: Option<&mut Obs>,
) -> Result<usize, AutomergeError> { ) -> Result<usize, AutomergeError> {
let changes = match load::load_changes(storage::parse::Input::new(data)) { let changes = match load::load_changes(storage::parse::Input::new(data)) {
load::LoadedChanges::Complete(c) => c, load::LoadedChanges::Complete(c) => c,
@ -697,7 +686,7 @@ impl Automerge {
} }
}; };
let start = self.ops.len(); let start = self.ops.len();
self.apply_changes_with(changes, options)?; self.apply_changes_with(changes, op_observer)?;
let delta = self.ops.len() - start; let delta = self.ops.len() - start;
Ok(delta) Ok(delta)
} }
@@ -717,14 +706,14 @@ impl Automerge {
&mut self, &mut self,
changes: impl IntoIterator<Item = Change>, changes: impl IntoIterator<Item = Change>,
) -> Result<(), AutomergeError> { ) -> Result<(), AutomergeError> {
self.apply_changes_with::<_, ()>(changes, ApplyOptions::default()) self.apply_changes_with::<_, ()>(changes, None)
} }
/// Apply changes to this document. /// Apply changes to this document.
pub fn apply_changes_with<I: IntoIterator<Item = Change>, Obs: OpObserver>( pub fn apply_changes_with<I: IntoIterator<Item = Change>, Obs: OpObserver>(
&mut self, &mut self,
changes: I, changes: I,
mut options: ApplyOptions<'_, Obs>, mut op_observer: Option<&mut Obs>,
) -> Result<(), AutomergeError> { ) -> Result<(), AutomergeError> {
for c in changes { for c in changes {
if !self.history_index.contains_key(&c.hash()) { if !self.history_index.contains_key(&c.hash()) {
@@ -735,7 +724,7 @@
)); ));
} }
if self.is_causally_ready(&c) { if self.is_causally_ready(&c) {
self.apply_change(c, &mut options.op_observer); self.apply_change(c, &mut op_observer);
} else { } else {
self.queue.push(c); self.queue.push(c);
} }
@@ -743,7 +732,7 @@
} }
while let Some(c) = self.pop_next_causally_ready_change() { while let Some(c) = self.pop_next_causally_ready_change() {
if !self.history_index.contains_key(&c.hash()) { if !self.history_index.contains_key(&c.hash()) {
self.apply_change(c, &mut options.op_observer); self.apply_change(c, &mut op_observer);
} }
} }
Ok(()) Ok(())
@@ -831,14 +820,14 @@ impl Automerge {
/// Takes all the changes in `other` which are not in `self` and applies them /// Takes all the changes in `other` which are not in `self` and applies them
pub fn merge(&mut self, other: &mut Self) -> Result<Vec<ChangeHash>, AutomergeError> { pub fn merge(&mut self, other: &mut Self) -> Result<Vec<ChangeHash>, AutomergeError> {
self.merge_with::<()>(other, ApplyOptions::default()) self.merge_with::<()>(other, None)
} }
/// Takes all the changes in `other` which are not in `self` and applies them /// Takes all the changes in `other` which are not in `self` and applies them
pub fn merge_with<'a, Obs: OpObserver>( pub fn merge_with<Obs: OpObserver>(
&mut self, &mut self,
other: &mut Self, other: &mut Self,
options: ApplyOptions<'a, Obs>, op_observer: Option<&mut Obs>,
) -> Result<Vec<ChangeHash>, AutomergeError> { ) -> Result<Vec<ChangeHash>, AutomergeError> {
// TODO: Make this fallible and figure out how to do this transactionally // TODO: Make this fallible and figure out how to do this transactionally
let changes = self let changes = self
@@ -847,7 +836,7 @@ impl Automerge {
.cloned() .cloned()
.collect::<Vec<_>>(); .collect::<Vec<_>>();
tracing::trace!(changes=?changes.iter().map(|c| c.hash()).collect::<Vec<_>>(), "merging new changes"); tracing::trace!(changes=?changes.iter().map(|c| c.hash()).collect::<Vec<_>>(), "merging new changes");
self.apply_changes_with(changes, options)?; self.apply_changes_with(changes, op_observer)?;
Ok(self.get_heads()) Ok(self.get_heads())
} }


@@ -1437,19 +1437,15 @@ fn observe_counter_change_application_overwrite() {
doc1.increment(ROOT, "counter", 5).unwrap(); doc1.increment(ROOT, "counter", 5).unwrap();
doc1.commit(); doc1.commit();
let mut observer = VecOpObserver::default(); let mut doc3 = doc1.fork().with_observer(VecOpObserver::default());
let mut doc3 = doc1.clone(); doc3.merge(&mut doc2).unwrap();
doc3.merge_with(
&mut doc2,
ApplyOptions::default().with_op_observer(&mut observer),
)
.unwrap();
assert_eq!( assert_eq!(
observer.take_patches(), doc3.observer().take_patches(),
vec![Patch::Put { vec![Patch::Put {
obj: ExId::Root, obj: ExId::Root,
key: Prop::Map("counter".into()), path: vec![],
prop: Prop::Map("counter".into()),
value: ( value: (
ScalarValue::Str("mystring".into()).into(), ScalarValue::Str("mystring".into()).into(),
ExId::Id(2, doc2.get_actor().clone(), 1) ExId::Id(2, doc2.get_actor().clone(), 1)
@ -1458,16 +1454,11 @@ fn observe_counter_change_application_overwrite() {
}] }]
); );
let mut observer = VecOpObserver::default(); let mut doc4 = doc2.clone().with_observer(VecOpObserver::default());
let mut doc4 = doc2.clone(); doc4.merge(&mut doc1).unwrap();
doc4.merge_with(
&mut doc1,
ApplyOptions::default().with_op_observer(&mut observer),
)
.unwrap();
// no patches as the increments operate on an invisible counter // no patches as the increments operate on an invisible counter
assert_eq!(observer.take_patches(), vec![]); assert_eq!(doc4.observer().take_patches(), vec![]);
} }
#[test] #[test]
@ -1478,20 +1469,15 @@ fn observe_counter_change_application() {
doc.increment(ROOT, "counter", 5).unwrap(); doc.increment(ROOT, "counter", 5).unwrap();
let changes = doc.get_changes(&[]).unwrap().into_iter().cloned(); let changes = doc.get_changes(&[]).unwrap().into_iter().cloned();
let mut new_doc = AutoCommit::new(); let mut new_doc = AutoCommit::new().with_observer(VecOpObserver::default());
let mut observer = VecOpObserver::default(); new_doc.apply_changes(changes).unwrap();
new_doc
.apply_changes_with(
changes,
ApplyOptions::default().with_op_observer(&mut observer),
)
.unwrap();
assert_eq!( assert_eq!(
observer.take_patches(), new_doc.observer().take_patches(),
vec![ vec![
Patch::Put { Patch::Put {
obj: ExId::Root, obj: ExId::Root,
key: Prop::Map("counter".into()), path: vec![],
prop: Prop::Map("counter".into()),
value: ( value: (
ScalarValue::counter(1).into(), ScalarValue::counter(1).into(),
ExId::Id(1, doc.get_actor().clone(), 0) ExId::Id(1, doc.get_actor().clone(), 0)
@ -1500,12 +1486,14 @@ fn observe_counter_change_application() {
}, },
Patch::Increment { Patch::Increment {
obj: ExId::Root, obj: ExId::Root,
key: Prop::Map("counter".into()), path: vec![],
prop: Prop::Map("counter".into()),
value: (2, ExId::Id(2, doc.get_actor().clone(), 0)), value: (2, ExId::Id(2, doc.get_actor().clone(), 0)),
}, },
Patch::Increment { Patch::Increment {
obj: ExId::Root, obj: ExId::Root,
key: Prop::Map("counter".into()), path: vec![],
prop: Prop::Map("counter".into()),
value: (5, ExId::Id(3, doc.get_actor().clone(), 0)), value: (5, ExId::Id(3, doc.get_actor().clone(), 0)),
} }
] ]
@ -1514,7 +1502,7 @@ fn observe_counter_change_application() {
#[test] #[test]
fn get_changes_heads_empty() { fn get_changes_heads_empty() {
let mut doc = AutoCommit::new(); let mut doc = AutoCommit::default();
doc.put(ROOT, "key1", 1).unwrap(); doc.put(ROOT, "key1", 1).unwrap();
doc.commit(); doc.commit();
doc.put(ROOT, "key2", 1).unwrap(); doc.put(ROOT, "key2", 1).unwrap();


@@ -75,7 +75,6 @@ mod map_range_at;
 mod op_observer;
 mod op_set;
 mod op_tree;
-mod options;
 mod parents;
 mod query;
 mod storage;
@@ -88,7 +87,7 @@ mod values;
 mod visualisation;

 pub use crate::automerge::Automerge;
-pub use autocommit::AutoCommit;
+pub use autocommit::{AutoCommit, AutoCommitWithObs};
 pub use autoserde::AutoSerde;
 pub use change::{Change, LoadError as LoadChangeError};
 pub use error::AutomergeError;
@@ -105,7 +104,6 @@ pub use map_range_at::MapRangeAt;
 pub use op_observer::OpObserver;
 pub use op_observer::Patch;
 pub use op_observer::VecOpObserver;
-pub use options::ApplyOptions;
 pub use parents::Parents;
 pub use types::{ActorId, ChangeHash, ObjType, OpType, Prop};
 pub use value::{ScalarValue, Value};
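The test changes above switch from threading a separate observer variable through every call to a document that owns its observer: `with_observer` moves the observer in, and `observer()` hands back a mutable reference. A standalone sketch of that ownership pattern (the `Doc` type here is a simplified stand-in, not the real `AutoCommit`):

```rust
// A patch recorder, loosely like `VecOpObserver`.
#[derive(Default)]
struct VecOpObserver {
    patches: Vec<String>,
}

impl VecOpObserver {
    fn take_patches(&mut self) -> Vec<String> {
        // Drain the buffer, leaving it empty for the next batch.
        std::mem::take(&mut self.patches)
    }
}

// A document generic over its observer type.
struct Doc<Obs> {
    observer: Obs,
}

impl Doc<()> {
    fn new() -> Self {
        Doc { observer: () }
    }
}

impl<Obs> Doc<Obs> {
    // Consumes the document and returns one with a different observer,
    // changing the type parameter in the process.
    fn with_observer<Obs2>(self, observer: Obs2) -> Doc<Obs2> {
        Doc { observer }
    }

    fn observer(&mut self) -> &mut Obs {
        &mut self.observer
    }
}

fn main() {
    let mut doc = Doc::new().with_observer(VecOpObserver::default());
    doc.observer().patches.push("Put { prop: \"counter\" }".to_string());
    assert_eq!(doc.observer().take_patches().len(), 1);
    // Taking the patches drains them.
    assert!(doc.observer().take_patches().is_empty());
}
```

Because `with_observer` changes the type parameter, a document with and without an observer are distinct types, which is presumably why the separate `AutoCommitWithObs` alias is exported.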


@@ -1,50 +1,105 @@
 use crate::exid::ExId;
+use crate::Parents;
 use crate::Prop;
 use crate::Value;

 /// An observer of operations applied to the document.
-pub trait OpObserver {
+pub trait OpObserver: Default + Clone {
     /// A new value has been inserted into the given object.
     ///
     /// - `objid`: the object that has been inserted into.
     /// - `index`: the index the new value has been inserted at.
     /// - `tagged_value`: the value that has been inserted and the id of the operation that did the
     ///   insert.
-    fn insert(&mut self, objid: ExId, index: usize, tagged_value: (Value<'_>, ExId));
+    fn insert(
+        &mut self,
+        parents: Parents<'_>,
+        objid: ExId,
+        index: usize,
+        tagged_value: (Value<'_>, ExId),
+    );

     /// A new value has been put into the given object.
     ///
     /// - `objid`: the object that has been put into.
-    /// - `key`: the key that the value has been put at.
+    /// - `prop`: the prop that the value has been put at.
     /// - `tagged_value`: the value that has been put into the object and the id of the operation
     ///   that did the put.
     /// - `conflict`: whether this put conflicts with other operations.
-    fn put(&mut self, objid: ExId, key: Prop, tagged_value: (Value<'_>, ExId), conflict: bool);
+    fn put(
+        &mut self,
+        parents: Parents<'_>,
+        objid: ExId,
+        prop: Prop,
+        tagged_value: (Value<'_>, ExId),
+        conflict: bool,
+    );

     /// A counter has been incremented.
     ///
     /// - `objid`: the object that contains the counter.
-    /// - `key`: the key that the counter is at.
+    /// - `prop`: the prop that the counter is at.
     /// - `tagged_value`: the amount the counter has been incremented by, and the id of the
     ///   increment operation.
-    fn increment(&mut self, objid: ExId, key: Prop, tagged_value: (i64, ExId));
+    fn increment(
+        &mut self,
+        parents: Parents<'_>,
+        objid: ExId,
+        prop: Prop,
+        tagged_value: (i64, ExId),
+    );

     /// A value has been deleted.
     ///
     /// - `objid`: the object that has been deleted in.
-    /// - `key`: the key of the value that has been deleted.
-    fn delete(&mut self, objid: ExId, key: Prop);
+    /// - `prop`: the prop of the value that has been deleted.
+    fn delete(&mut self, parents: Parents<'_>, objid: ExId, prop: Prop);
+
+    /// Merge data with another observer
+    ///
+    /// - `other`: Another Op Observer of the same type
+    fn merge(&mut self, other: &Self);
+
+    /// Branch off to begin a transaction - allows state information to be copied if needed
+    fn branch(&self) -> Self {
+        Self::default()
+    }
 }

 impl OpObserver for () {
-    fn insert(&mut self, _objid: ExId, _index: usize, _tagged_value: (Value<'_>, ExId)) {}
-
-    fn put(&mut self, _objid: ExId, _key: Prop, _tagged_value: (Value<'_>, ExId), _conflict: bool) {
+    fn insert(
+        &mut self,
+        _parents: Parents<'_>,
+        _objid: ExId,
+        _index: usize,
+        _tagged_value: (Value<'_>, ExId),
+    ) {
     }

-    fn increment(&mut self, _objid: ExId, _key: Prop, _tagged_value: (i64, ExId)) {}
+    fn put(
+        &mut self,
+        _parents: Parents<'_>,
+        _objid: ExId,
+        _prop: Prop,
+        _tagged_value: (Value<'_>, ExId),
+        _conflict: bool,
+    ) {
+    }

-    fn delete(&mut self, _objid: ExId, _key: Prop) {}
+    fn increment(
+        &mut self,
+        _parents: Parents<'_>,
+        _objid: ExId,
+        _prop: Prop,
+        _tagged_value: (i64, ExId),
+    ) {
+    }
+
+    fn delete(&mut self, _parents: Parents<'_>, _objid: ExId, _prop: Prop) {}
+
+    fn merge(&mut self, _other: &Self) {}
 }

 /// Capture operations into a [`Vec`] and store them as patches.
@@ -62,45 +117,77 @@ impl VecOpObserver {
 }

 impl OpObserver for VecOpObserver {
-    fn insert(&mut self, obj_id: ExId, index: usize, (value, id): (Value<'_>, ExId)) {
+    fn insert(
+        &mut self,
+        mut parents: Parents<'_>,
+        obj: ExId,
+        index: usize,
+        (value, id): (Value<'_>, ExId),
+    ) {
+        let path = parents.path();
         self.patches.push(Patch::Insert {
-            obj: obj_id,
+            obj,
+            path,
             index,
             value: (value.into_owned(), id),
         });
     }

-    fn put(&mut self, objid: ExId, key: Prop, (value, id): (Value<'_>, ExId), conflict: bool) {
+    fn put(
+        &mut self,
+        mut parents: Parents<'_>,
+        obj: ExId,
+        prop: Prop,
+        (value, id): (Value<'_>, ExId),
+        conflict: bool,
+    ) {
+        let path = parents.path();
         self.patches.push(Patch::Put {
-            obj: objid,
-            key,
+            obj,
+            path,
+            prop,
             value: (value.into_owned(), id),
             conflict,
         });
     }

-    fn increment(&mut self, objid: ExId, key: Prop, tagged_value: (i64, ExId)) {
+    fn increment(
+        &mut self,
+        mut parents: Parents<'_>,
+        obj: ExId,
+        prop: Prop,
+        tagged_value: (i64, ExId),
+    ) {
+        let path = parents.path();
         self.patches.push(Patch::Increment {
-            obj: objid,
-            key,
+            obj,
+            path,
+            prop,
             value: tagged_value,
         });
     }

-    fn delete(&mut self, objid: ExId, key: Prop) {
-        self.patches.push(Patch::Delete { obj: objid, key })
+    fn delete(&mut self, mut parents: Parents<'_>, obj: ExId, prop: Prop) {
+        let path = parents.path();
+        self.patches.push(Patch::Delete { obj, path, prop })
+    }
+
+    fn merge(&mut self, other: &Self) {
+        self.patches.extend_from_slice(other.patches.as_slice())
     }
 }

 /// A notification to the application that something has changed in a document.
 #[derive(Debug, Clone, PartialEq)]
 pub enum Patch {
-    /// Associating a new value with a key in a map, or an existing list element
+    /// Associating a new value with a prop in a map, or an existing list element
     Put {
+        /// path to the object
+        path: Vec<(ExId, Prop)>,
         /// The object that was put into.
         obj: ExId,
-        /// The key that the new value was put at.
-        key: Prop,
+        /// The prop that the new value was put at.
+        prop: Prop,
         /// The value that was put, and the id of the operation that put it there.
         value: (Value<'static>, ExId),
         /// Whether this put conflicts with another.
@@ -108,6 +195,8 @@ pub enum Patch {
     },
     /// Inserting a new element into a list/text
     Insert {
+        /// path to the object
+        path: Vec<(ExId, Prop)>,
         /// The object that was inserted into.
         obj: ExId,
         /// The index that the new value was inserted at.
@@ -117,19 +206,23 @@ pub enum Patch {
     },
     /// Incrementing a counter.
     Increment {
+        /// path to the object
+        path: Vec<(ExId, Prop)>,
         /// The object that was incremented in.
         obj: ExId,
-        /// The key that was incremented.
-        key: Prop,
+        /// The prop that was incremented.
+        prop: Prop,
         /// The amount that the counter was incremented by, and the id of the operation that
         /// did the increment.
         value: (i64, ExId),
     },
     /// Deleting an element from a list/text
     Delete {
+        /// path to the object
+        path: Vec<(ExId, Prop)>,
        /// The object that was deleted from.
         obj: ExId,
-        /// The key that was deleted.
-        key: Prop,
+        /// The prop that was deleted.
+        prop: Prop,
     },
 }
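The reshaped `Patch` variants above all gain a `path` field describing where in the document tree the change happened, and `key` is renamed to `prop` throughout. A minimal standalone sketch of consuming such a patch (one simplified variant with primitive stand-in types, not the real `ExId`/`Value`):

```rust
// A pared-down Patch with the new shape: a root-to-leaf path plus the
// object, prop, and value the change applies to.
#[derive(Debug, Clone, PartialEq)]
enum Patch {
    Put {
        /// Path to the object, outermost parent first.
        path: Vec<(u64, String)>,
        obj: u64,
        prop: String,
        value: i64,
    },
}

fn main() {
    let patch = Patch::Put {
        // A value at root -> "todos" -> object 1 -> "title".
        path: vec![(0, "todos".to_string())],
        obj: 1,
        prop: "title".to_string(),
        value: 7,
    };
    match patch {
        Patch::Put { path, prop, .. } => {
            // A top-level change would carry `path: vec![]`, as the
            // updated tests above expect.
            assert_eq!(path.len(), 1);
            assert_eq!(prop, "title");
        }
    }
}
```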


@@ -2,8 +2,9 @@ use crate::clock::Clock;
 use crate::exid::ExId;
 use crate::indexed_cache::IndexedCache;
 use crate::op_tree::{self, OpTree};
+use crate::parents::Parents;
 use crate::query::{self, OpIdSearch, TreeQuery};
-use crate::types::{self, ActorId, Key, ObjId, Op, OpId, OpIds, OpType};
+use crate::types::{self, ActorId, Key, ObjId, Op, OpId, OpIds, OpType, Prop};
 use crate::{ObjType, OpObserver};
 use fxhash::FxBuildHasher;
 use std::borrow::Borrow;
@@ -68,12 +69,29 @@ impl OpSetInternal {
         }
     }

+    pub(crate) fn parents(&self, obj: ObjId) -> Parents<'_> {
+        Parents { obj, ops: self }
+    }
+
     pub(crate) fn parent_object(&self, obj: &ObjId) -> Option<(ObjId, Key)> {
         let parent = self.trees.get(obj)?.parent?;
         let key = self.search(&parent, OpIdSearch::new(obj.0)).key().unwrap();
         Some((parent, key))
     }

+    pub(crate) fn export_key(&self, obj: ObjId, key: Key) -> Prop {
+        match key {
+            Key::Map(m) => Prop::Map(self.m.props.get(m).into()),
+            Key::Seq(opid) => {
+                let i = self
+                    .search(&obj, query::ElemIdPos::new(opid))
+                    .index()
+                    .unwrap();
+                Prop::Seq(i)
+            }
+        }
+    }
+
     pub(crate) fn keys(&self, obj: ObjId) -> Option<query::Keys<'_>> {
         if let Some(tree) = self.trees.get(&obj) {
             tree.internal.keys()
@@ -245,6 +263,8 @@ impl OpSetInternal {
         } = q;

         let ex_obj = self.id_to_exid(obj.0);
+        let parents = self.parents(*obj);
+
         let key = match op.key {
             Key::Map(index) => self.m.props[index].clone().into(),
             Key::Seq(_) => seen.into(),
@@ -252,21 +272,21 @@ impl OpSetInternal {

         if op.insert {
             let value = (op.value(), self.id_to_exid(op.id));
-            observer.insert(ex_obj, seen, value);
+            observer.insert(parents, ex_obj, seen, value);
         } else if op.is_delete() {
             if let Some(winner) = &values.last() {
                 let value = (winner.value(), self.id_to_exid(winner.id));
                 let conflict = values.len() > 1;
-                observer.put(ex_obj, key, value, conflict);
+                observer.put(parents, ex_obj, key, value, conflict);
             } else {
-                observer.delete(ex_obj, key);
+                observer.delete(parents, ex_obj, key);
             }
         } else if let Some(value) = op.get_increment_value() {
             // only observe this increment if the counter is visible, i.e. the counter's
             // create op is in the values
             if values.iter().any(|value| op.pred.contains(&value.id)) {
                 // we have observed the value
-                observer.increment(ex_obj, key, (value, self.id_to_exid(op.id)));
+                observer.increment(parents, ex_obj, key, (value, self.id_to_exid(op.id)));
             }
         } else {
             let winner = if let Some(last_value) = values.last() {
@@ -280,10 +300,10 @@ impl OpSetInternal {
             };
             let value = (winner.value(), self.id_to_exid(winner.id));
             if op.is_list_op() && !had_value_before {
-                observer.insert(ex_obj, seen, value);
+                observer.insert(parents, ex_obj, seen, value);
             } else {
                 let conflict = !values.is_empty();
-                observer.put(ex_obj, key, value, conflict);
+                observer.put(parents, ex_obj, key, value, conflict);
             }
         }


@@ -1,16 +0,0 @@
-#[derive(Debug, Default)]
-pub struct ApplyOptions<'a, Obs> {
-    pub op_observer: Option<&'a mut Obs>,
-}
-
-impl<'a, Obs> ApplyOptions<'a, Obs> {
-    pub fn with_op_observer(mut self, op_observer: &'a mut Obs) -> Self {
-        self.op_observer = Some(op_observer);
-        self
-    }
-
-    pub fn set_op_observer(&mut self, op_observer: &'a mut Obs) -> &mut Self {
-        self.op_observer = Some(op_observer);
-        self
-    }
-}


@@ -1,18 +1,33 @@
-use crate::{exid::ExId, types::ObjId, Automerge, Prop};
+use crate::op_set::OpSet;
+use crate::types::ObjId;
+use crate::{exid::ExId, Prop};

 #[derive(Debug)]
 pub struct Parents<'a> {
     pub(crate) obj: ObjId,
-    pub(crate) doc: &'a Automerge,
+    pub(crate) ops: &'a OpSet,
+}
+
+impl<'a> Parents<'a> {
+    pub fn path(&mut self) -> Vec<(ExId, Prop)> {
+        let mut path = self.collect::<Vec<_>>();
+        path.reverse();
+        path
+    }
 }

 impl<'a> Iterator for Parents<'a> {
     type Item = (ExId, Prop);

     fn next(&mut self) -> Option<Self::Item> {
-        if let Some((obj, key)) = self.doc.parent_object(self.obj) {
+        if self.obj.is_root() {
+            None
+        } else if let Some((obj, key)) = self.ops.parent_object(&self.obj) {
             self.obj = obj;
-            Some((self.doc.id_to_exid(obj.0), self.doc.export_key(obj, key)))
+            Some((
+                self.ops.id_to_exid(self.obj.0),
+                self.ops.export_key(self.obj, key),
+            ))
         } else {
             None
         }
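The new `path()` helper above collects the parent chain, which the iterator yields innermost-first, and reverses it so the resulting path reads root-to-leaf. A standalone sketch of that reversal (using `(u64, String)` pairs as stand-ins for `(ExId, Prop)`):

```rust
// Collect a parent iterator (innermost parent first) into a
// root-to-leaf path, mirroring `Parents::path`.
fn path_from_parents<I: Iterator<Item = (u64, String)>>(parents: I) -> Vec<(u64, String)> {
    let mut path: Vec<_> = parents.collect();
    path.reverse();
    path
}

fn main() {
    // Suppose a value lives at root -> "todos" -> element 0 -> "title";
    // walking parents upward yields the innermost entry first.
    let parents = vec![
        (2, "title".to_string()),
        (1, "0".to_string()),
        (0, "todos".to_string()),
    ];
    let path = path_from_parents(parents.into_iter());
    assert_eq!(path.first().map(|p| p.1.as_str()), Some("todos"));
    assert_eq!(path.last().map(|p| p.1.as_str()), Some("title"));
}
```

In the real method, `self.collect()` works on `&mut Parents` via the blanket `Iterator` impl for `&mut I`, so the iterator is driven to exhaustion without moving it.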


@@ -3,7 +3,7 @@ use std::collections::{HashMap, HashSet};

 use crate::{
     storage::{parse, Change as StoredChange, ReadChangeOpError},
-    ApplyOptions, Automerge, AutomergeError, Change, ChangeHash, OpObserver,
+    Automerge, AutomergeError, Change, ChangeHash, OpObserver,
 };

 mod bloom;
@@ -104,14 +104,14 @@ impl Automerge {
         sync_state: &mut State,
         message: Message,
     ) -> Result<(), AutomergeError> {
-        self.receive_sync_message_with::<()>(sync_state, message, ApplyOptions::default())
+        self.receive_sync_message_with::<()>(sync_state, message, None)
     }

-    pub fn receive_sync_message_with<'a, Obs: OpObserver>(
+    pub fn receive_sync_message_with<Obs: OpObserver>(
         &mut self,
         sync_state: &mut State,
         message: Message,
-        options: ApplyOptions<'a, Obs>,
+        op_observer: Option<&mut Obs>,
     ) -> Result<(), AutomergeError> {
         let before_heads = self.get_heads();
@@ -124,7 +124,7 @@ impl Automerge {
         let changes_is_empty = message_changes.is_empty();
         if !changes_is_empty {
-            self.apply_changes_with(message_changes, options)?;
+            self.apply_changes_with(message_changes, op_observer)?;
             sync_state.shared_heads = advance_heads(
                 &before_heads.iter().collect(),
                 &self.get_heads().into_iter().collect(),


@@ -11,4 +11,4 @@ pub use manual_transaction::Transaction;
 pub use result::Failure;
 pub use result::Success;

-pub type Result<O, E> = std::result::Result<Success<O>, Failure<E>>;
+pub type Result<O, Obs, E> = std::result::Result<Success<O, Obs>, Failure<E>>;


@@ -1,12 +1,11 @@
 /// Optional metadata for a commit.
 #[derive(Debug, Default)]
-pub struct CommitOptions<'a, Obs> {
+pub struct CommitOptions {
     pub message: Option<String>,
     pub time: Option<i64>,
-    pub op_observer: Option<&'a mut Obs>,
 }

-impl<'a, Obs> CommitOptions<'a, Obs> {
+impl CommitOptions {
     /// Add a message to the commit.
     pub fn with_message<S: Into<String>>(mut self, message: S) -> Self {
         self.message = Some(message.into());
@@ -30,14 +29,4 @@ impl<'a, Obs> CommitOptions<'a, Obs> {
         self.time = Some(time);
         self
     }
-
-    pub fn with_op_observer(mut self, op_observer: &'a mut Obs) -> Self {
-        self.op_observer = Some(op_observer);
-        self
-    }
-
-    pub fn set_op_observer(&mut self, op_observer: &'a mut Obs) -> &mut Self {
-        self.op_observer = Some(op_observer);
-        self
-    }
-}
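With the observer field gone, `CommitOptions` shrinks to a plain builder over message and time, and loses its lifetime and type parameters. A self-contained sketch of the resulting shape and its builder usage (the timestamp value is illustrative only):

```rust
/// Optional metadata for a commit, after the observer field is removed.
#[derive(Debug, Default)]
struct CommitOptions {
    message: Option<String>,
    time: Option<i64>,
}

impl CommitOptions {
    /// Add a message to the commit.
    fn with_message<S: Into<String>>(mut self, message: S) -> Self {
        self.message = Some(message.into());
        self
    }

    /// Set the timestamp of the commit.
    fn with_time(mut self, time: i64) -> Self {
        self.time = Some(time);
        self
    }
}

fn main() {
    // No type parameter is needed any more just to say "no observer".
    let opts = CommitOptions::default()
        .with_message("fix typo")
        .with_time(1_664_000_000);
    assert_eq!(opts.message.as_deref(), Some("fix typo"));
    assert_eq!(opts.time, Some(1_664_000_000));
}
```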


@ -26,13 +26,12 @@ impl TransactionInner {
/// Commit the operations performed in this transaction, returning the hashes corresponding to /// Commit the operations performed in this transaction, returning the hashes corresponding to
/// the new heads. /// the new heads.
#[tracing::instrument(skip(self, doc, op_observer))] #[tracing::instrument(skip(self, doc))]
pub(crate) fn commit<Obs: OpObserver>( pub(crate) fn commit(
mut self, mut self,
doc: &mut Automerge, doc: &mut Automerge,
message: Option<String>, message: Option<String>,
time: Option<i64>, time: Option<i64>,
op_observer: Option<&mut Obs>,
) -> ChangeHash { ) -> ChangeHash {
if message.is_some() { if message.is_some() {
self.message = message; self.message = message;
@ -42,25 +41,27 @@ impl TransactionInner {
self.time = t; self.time = t;
} }
if let Some(observer) = op_observer { /*
for (obj, prop, op) in &self.operations { if let Some(observer) = op_observer {
let ex_obj = doc.ops.id_to_exid(obj.0); for (obj, prop, op) in &self.operations {
if op.insert { let ex_obj = doc.ops.id_to_exid(obj.0);
let value = (op.value(), doc.id_to_exid(op.id)); if op.insert {
match prop { let value = (op.value(), doc.id_to_exid(op.id));
Prop::Map(_) => panic!("insert into a map"), match prop {
Prop::Seq(index) => observer.insert(ex_obj, *index, value), Prop::Map(_) => panic!("insert into a map"),
Prop::Seq(index) => observer.insert(ex_obj, *index, value),
}
} else if op.is_delete() {
observer.delete(ex_obj, prop.clone());
} else if let Some(value) = op.get_increment_value() {
observer.increment(ex_obj, prop.clone(), (value, doc.id_to_exid(op.id)));
} else {
let value = (op.value(), doc.ops.id_to_exid(op.id));
observer.put(ex_obj, prop.clone(), value, false);
}
} }
} else if op.is_delete() {
observer.delete(ex_obj, prop.clone());
} else if let Some(value) = op.get_increment_value() {
observer.increment(ex_obj, prop.clone(), (value, doc.id_to_exid(op.id)));
} else {
let value = (op.value(), doc.ops.id_to_exid(op.id));
observer.put(ex_obj, prop.clone(), value, false);
} }
} */
}
let num_ops = self.pending_ops(); let num_ops = self.pending_ops();
let change = self.export(&doc.ops.m); let change = self.export(&doc.ops.m);
@ -150,9 +151,10 @@ impl TransactionInner {
/// - The object does not exist /// - The object does not exist
/// - The key is the wrong type for the object /// - The key is the wrong type for the object
/// - The key does not exist in the object /// - The key does not exist in the object
pub(crate) fn put<P: Into<Prop>, V: Into<ScalarValue>>( pub(crate) fn put<P: Into<Prop>, V: Into<ScalarValue>, Obs: OpObserver>(
&mut self, &mut self,
doc: &mut Automerge, doc: &mut Automerge,
op_observer: &mut Obs,
ex_obj: &ExId, ex_obj: &ExId,
prop: P, prop: P,
value: V, value: V,
@ -160,7 +162,7 @@ impl TransactionInner {
let obj = doc.exid_to_obj(ex_obj)?; let obj = doc.exid_to_obj(ex_obj)?;
let value = value.into(); let value = value.into();
let prop = prop.into(); let prop = prop.into();
self.local_op(doc, obj, prop, value.into())?; self.local_op(doc, op_observer, obj, prop, value.into())?;
Ok(()) Ok(())
} }
@ -177,16 +179,19 @@ impl TransactionInner {
/// - The object does not exist /// - The object does not exist
/// - The key is the wrong type for the object /// - The key is the wrong type for the object
/// - The key does not exist in the object /// - The key does not exist in the object
pub(crate) fn put_object<P: Into<Prop>>( pub(crate) fn put_object<P: Into<Prop>, Obs: OpObserver>(
&mut self, &mut self,
doc: &mut Automerge, doc: &mut Automerge,
op_observer: &mut Obs,
ex_obj: &ExId, ex_obj: &ExId,
prop: P, prop: P,
value: ObjType, value: ObjType,
) -> Result<ExId, AutomergeError> { ) -> Result<ExId, AutomergeError> {
let obj = doc.exid_to_obj(ex_obj)?; let obj = doc.exid_to_obj(ex_obj)?;
let prop = prop.into(); let prop = prop.into();
let id = self.local_op(doc, obj, prop, value.into())?.unwrap(); let id = self
.local_op(doc, op_observer, obj, prop, value.into())?
.unwrap();
let id = doc.id_to_exid(id); let id = doc.id_to_exid(id);
Ok(id) Ok(id)
} }
@ -195,9 +200,11 @@ impl TransactionInner {
OpId(self.start_op.get() + self.pending_ops() as u64, self.actor) OpId(self.start_op.get() + self.pending_ops() as u64, self.actor)
} }
fn insert_local_op( #[allow(clippy::too_many_arguments)]
fn insert_local_op<Obs: OpObserver>(
&mut self, &mut self,
doc: &mut Automerge, doc: &mut Automerge,
op_observer: &mut Obs,
prop: Prop, prop: Prop,
op: Op, op: Op,
pos: usize, pos: usize,
@ -210,12 +217,13 @@ impl TransactionInner {
doc.ops.insert(pos, &obj, op.clone()); doc.ops.insert(pos, &obj, op.clone());
} }
self.operations.push((obj, prop, op)); self.finalize_op(doc, op_observer, obj, prop, op);
} }
pub(crate) fn insert<V: Into<ScalarValue>>( pub(crate) fn insert<V: Into<ScalarValue>, Obs: OpObserver>(
&mut self, &mut self,
doc: &mut Automerge, doc: &mut Automerge,
op_observer: &mut Obs,
ex_obj: &ExId, ex_obj: &ExId,
index: usize, index: usize,
value: V, value: V,
@ -223,26 +231,28 @@ impl TransactionInner {
let obj = doc.exid_to_obj(ex_obj)?; let obj = doc.exid_to_obj(ex_obj)?;
let value = value.into(); let value = value.into();
tracing::trace!(obj=?obj, value=?value, "inserting value"); tracing::trace!(obj=?obj, value=?value, "inserting value");
self.do_insert(doc, obj, index, value.into())?; self.do_insert(doc, op_observer, obj, index, value.into())?;
Ok(()) Ok(())
} }
pub(crate) fn insert_object( pub(crate) fn insert_object<Obs: OpObserver>(
&mut self, &mut self,
doc: &mut Automerge, doc: &mut Automerge,
op_observer: &mut Obs,
ex_obj: &ExId, ex_obj: &ExId,
index: usize, index: usize,
value: ObjType, value: ObjType,
) -> Result<ExId, AutomergeError> { ) -> Result<ExId, AutomergeError> {
let obj = doc.exid_to_obj(ex_obj)?; let obj = doc.exid_to_obj(ex_obj)?;
let id = self.do_insert(doc, obj, index, value.into())?; let id = self.do_insert(doc, op_observer, obj, index, value.into())?;
let id = doc.id_to_exid(id); let id = doc.id_to_exid(id);
Ok(id) Ok(id)
} }
fn do_insert( fn do_insert<Obs: OpObserver>(
&mut self, &mut self,
doc: &mut Automerge, doc: &mut Automerge,
op_observer: &mut Obs,
obj: ObjId, obj: ObjId,
index: usize, index: usize,
action: OpType, action: OpType,
@ -263,27 +273,30 @@ impl TransactionInner {
}; };
doc.ops.insert(query.pos(), &obj, op.clone()); doc.ops.insert(query.pos(), &obj, op.clone());
self.operations.push((obj, Prop::Seq(index), op));
self.finalize_op(doc, op_observer, obj, Prop::Seq(index), op);
Ok(id) Ok(id)
} }
pub(crate) fn local_op( pub(crate) fn local_op<Obs: OpObserver>(
&mut self, &mut self,
doc: &mut Automerge, doc: &mut Automerge,
op_observer: &mut Obs,
obj: ObjId, obj: ObjId,
prop: Prop, prop: Prop,
action: OpType, action: OpType,
) -> Result<Option<OpId>, AutomergeError> { ) -> Result<Option<OpId>, AutomergeError> {
match prop { match prop {
Prop::Map(s) => self.local_map_op(doc, obj, s, action), Prop::Map(s) => self.local_map_op(doc, op_observer, obj, s, action),
Prop::Seq(n) => self.local_list_op(doc, obj, n, action), Prop::Seq(n) => self.local_list_op(doc, op_observer, obj, n, action),
} }
} }
fn local_map_op( fn local_map_op<Obs: OpObserver>(
&mut self, &mut self,
doc: &mut Automerge, doc: &mut Automerge,
op_observer: &mut Obs,
obj: ObjId, obj: ObjId,
prop: String, prop: String,
action: OpType, action: OpType,
@ -324,14 +337,15 @@ impl TransactionInner {
let pos = query.pos; let pos = query.pos;
let ops_pos = query.ops_pos; let ops_pos = query.ops_pos;
self.insert_local_op(doc, Prop::Map(prop), op, pos, obj, &ops_pos); self.insert_local_op(doc, op_observer, Prop::Map(prop), op, pos, obj, &ops_pos);
Ok(Some(id)) Ok(Some(id))
} }
fn local_list_op( fn local_list_op<Obs: OpObserver>(
&mut self, &mut self,
doc: &mut Automerge, doc: &mut Automerge,
op_observer: &mut Obs,
obj: ObjId, obj: ObjId,
index: usize, index: usize,
action: OpType, action: OpType,
@ -363,40 +377,43 @@ impl TransactionInner {
let pos = query.pos; let pos = query.pos;
let ops_pos = query.ops_pos; let ops_pos = query.ops_pos;
self.insert_local_op(doc, Prop::Seq(index), op, pos, obj, &ops_pos); self.insert_local_op(doc, op_observer, Prop::Seq(index), op, pos, obj, &ops_pos);
Ok(Some(id)) Ok(Some(id))
} }
pub(crate) fn increment<P: Into<Prop>>( pub(crate) fn increment<P: Into<Prop>, Obs: OpObserver>(
&mut self, &mut self,
doc: &mut Automerge, doc: &mut Automerge,
+        op_observer: &mut Obs,
         obj: &ExId,
         prop: P,
         value: i64,
     ) -> Result<(), AutomergeError> {
         let obj = doc.exid_to_obj(obj)?;
-        self.local_op(doc, obj, prop.into(), OpType::Increment(value))?;
+        self.local_op(doc, op_observer, obj, prop.into(), OpType::Increment(value))?;
         Ok(())
     }

-    pub(crate) fn delete<P: Into<Prop>>(
+    pub(crate) fn delete<P: Into<Prop>, Obs: OpObserver>(
         &mut self,
         doc: &mut Automerge,
+        op_observer: &mut Obs,
         ex_obj: &ExId,
         prop: P,
     ) -> Result<(), AutomergeError> {
         let obj = doc.exid_to_obj(ex_obj)?;
         let prop = prop.into();
-        self.local_op(doc, obj, prop, OpType::Delete)?;
+        self.local_op(doc, op_observer, obj, prop, OpType::Delete)?;
         Ok(())
     }

     /// Splice new elements into the given sequence. Returns a vector of the OpIds used to insert
     /// the new elements
-    pub(crate) fn splice(
+    pub(crate) fn splice<Obs: OpObserver>(
         &mut self,
         doc: &mut Automerge,
+        op_observer: &mut Obs,
         ex_obj: &ExId,
         mut pos: usize,
         del: usize,
@@ -405,15 +422,48 @@ impl TransactionInner {
         let obj = doc.exid_to_obj(ex_obj)?;
         for _ in 0..del {
             // del()
-            self.local_op(doc, obj, pos.into(), OpType::Delete)?;
+            self.local_op(doc, op_observer, obj, pos.into(), OpType::Delete)?;
         }
         for v in vals {
             // insert()
-            self.do_insert(doc, obj, pos, v.clone().into())?;
+            self.do_insert(doc, op_observer, obj, pos, v.clone().into())?;
             pos += 1;
         }
         Ok(())
     }

+    fn finalize_op<Obs: OpObserver>(
+        &mut self,
+        doc: &mut Automerge,
+        op_observer: &mut Obs,
+        obj: ObjId,
+        prop: Prop,
+        op: Op,
+    ) {
+        // TODO - id_to_exid should be a noop if not used - change type to Into<ExId>?
+        let ex_obj = doc.ops.id_to_exid(obj.0);
+        let parents = doc.ops.parents(obj);
+        if op.insert {
+            let value = (op.value(), doc.ops.id_to_exid(op.id));
+            match prop {
+                Prop::Map(_) => panic!("insert into a map"),
+                Prop::Seq(index) => op_observer.insert(parents, ex_obj, index, value),
+            }
+        } else if op.is_delete() {
+            op_observer.delete(parents, ex_obj, prop.clone());
+        } else if let Some(value) = op.get_increment_value() {
+            op_observer.increment(
+                parents,
+                ex_obj,
+                prop.clone(),
+                (value, doc.ops.id_to_exid(op.id)),
+            );
+        } else {
+            let value = (op.value(), doc.ops.id_to_exid(op.id));
+            op_observer.put(parents, ex_obj, prop.clone(), value, false);
+        }
+        self.operations.push((obj, prop, op));
+    }
 }

 #[cfg(test)]
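The `finalize_op` dispatch added above classifies each finished op and invokes the matching observer callback. A minimal standalone sketch of that pattern, using stand-in types rather than the real automerge API (the trait, callback signatures, and `VecObserver` here are simplifications for illustration):

```rust
// Stand-in observer trait; the real OpObserver callbacks also receive
// parents, object ids, and richer value types.
trait OpObserver {
    fn insert(&mut self, index: usize, value: String);
    fn put(&mut self, key: String, value: String);
    fn delete(&mut self, key: String);
    fn increment(&mut self, key: String, by: i64);
}

// Records every callback as a line of text, in the spirit of VecOpObserver.
#[derive(Default)]
struct VecObserver {
    ops: Vec<String>,
}

impl OpObserver for VecObserver {
    fn insert(&mut self, index: usize, value: String) {
        self.ops.push(format!("insert {index} {value}"));
    }
    fn put(&mut self, key: String, value: String) {
        self.ops.push(format!("put {key} {value}"));
    }
    fn delete(&mut self, key: String) {
        self.ops.push(format!("delete {key}"));
    }
    fn increment(&mut self, key: String, by: i64) {
        self.ops.push(format!("increment {key} {by}"));
    }
}

enum Op {
    Insert { index: usize, value: String },
    Put { key: String, value: String },
    Delete { key: String },
    Increment { key: String, by: i64 },
}

// Mirrors the shape of finalize_op: one branch per kind of operation.
fn finalize_op<Obs: OpObserver>(op_observer: &mut Obs, op: Op) {
    match op {
        Op::Insert { index, value } => op_observer.insert(index, value),
        Op::Delete { key } => op_observer.delete(key),
        Op::Increment { key, by } => op_observer.increment(key, by),
        Op::Put { key, value } => op_observer.put(key, value),
    }
}

fn main() {
    let mut obs = VecObserver::default();
    finalize_op(&mut obs, Op::Put { key: "counter".into(), value: "0".into() });
    finalize_op(&mut obs, Op::Increment { key: "counter".into(), by: 5 });
    for line in &obs.ops {
        println!("{line}");
    }
}
```

Because the dispatch is generic over the trait, a no-op observer compiles away entirely, which is why threading `op_observer` through every call is cheap when observation is unused.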
@@ -20,14 +20,15 @@ use super::{CommitOptions, Transactable, TransactionInner};
 /// intermediate state.
 /// This is consistent with `?` error handling.
 #[derive(Debug)]
-pub struct Transaction<'a> {
+pub struct Transaction<'a, Obs: OpObserver> {
     // this is an option so that we can take it during commit and rollback to prevent it being
     // rolled back during drop.
     pub(crate) inner: Option<TransactionInner>,
     pub(crate) doc: &'a mut Automerge,
+    pub op_observer: Obs,
 }

-impl<'a> Transaction<'a> {
+impl<'a, Obs: OpObserver> Transaction<'a, Obs> {
     /// Get the heads of the document before this transaction was started.
     pub fn get_heads(&self) -> Vec<ChangeHash> {
         self.doc.get_heads()
@@ -36,10 +37,7 @@ impl<'a> Transaction<'a> {
     /// Commit the operations performed in this transaction, returning the hashes corresponding to
     /// the new heads.
     pub fn commit(mut self) -> ChangeHash {
-        self.inner
-            .take()
-            .unwrap()
-            .commit::<()>(self.doc, None, None, None)
+        self.inner.take().unwrap().commit(self.doc, None, None)
     }

     /// Commit the operations in this transaction with some options.
@@ -56,15 +54,13 @@ impl<'a> Transaction<'a> {
     /// tx.put_object(ROOT, "todos", ObjType::List).unwrap();
     /// let now = SystemTime::now().duration_since(SystemTime::UNIX_EPOCH).unwrap().as_secs() as
     /// i64;
-    /// tx.commit_with::<()>(CommitOptions::default().with_message("Create todos list").with_time(now));
+    /// tx.commit_with(CommitOptions::default().with_message("Create todos list").with_time(now));
     /// ```
-    pub fn commit_with<Obs: OpObserver>(mut self, options: CommitOptions<'_, Obs>) -> ChangeHash {
-        self.inner.take().unwrap().commit(
-            self.doc,
-            options.message,
-            options.time,
-            options.op_observer,
-        )
+    pub fn commit_with(mut self, options: CommitOptions) -> ChangeHash {
+        self.inner
+            .take()
+            .unwrap()
+            .commit(self.doc, options.message, options.time)
     }

     /// Undo the operations added in this transaction, returning the number of cancelled
@@ -74,7 +70,7 @@ impl<'a> Transaction<'a> {
     }
 }

-impl<'a> Transactable for Transaction<'a> {
+impl<'a, Obs: OpObserver> Transactable for Transaction<'a, Obs> {
     /// Get the number of pending operations in this transaction.
     fn pending_ops(&self) -> usize {
         self.inner.as_ref().unwrap().pending_ops()
@@ -97,7 +93,7 @@ impl<'a> Transactable for Transaction<'a> {
         self.inner
             .as_mut()
             .unwrap()
-            .put(self.doc, obj.as_ref(), prop, value)
+            .put(self.doc, &mut self.op_observer, obj.as_ref(), prop, value)
     }

     fn put_object<O: AsRef<ExId>, P: Into<Prop>>(
@@ -106,10 +102,13 @@ impl<'a> Transactable for Transaction<'a> {
         prop: P,
         value: ObjType,
     ) -> Result<ExId, AutomergeError> {
-        self.inner
-            .as_mut()
-            .unwrap()
-            .put_object(self.doc, obj.as_ref(), prop, value)
+        self.inner.as_mut().unwrap().put_object(
+            self.doc,
+            &mut self.op_observer,
+            obj.as_ref(),
+            prop,
+            value,
+        )
     }

     fn insert<O: AsRef<ExId>, V: Into<ScalarValue>>(
@@ -118,10 +117,13 @@ impl<'a> Transactable for Transaction<'a> {
         index: usize,
         value: V,
     ) -> Result<(), AutomergeError> {
-        self.inner
-            .as_mut()
-            .unwrap()
-            .insert(self.doc, obj.as_ref(), index, value)
+        self.inner.as_mut().unwrap().insert(
+            self.doc,
+            &mut self.op_observer,
+            obj.as_ref(),
+            index,
+            value,
+        )
     }

     fn insert_object<O: AsRef<ExId>>(
@@ -130,10 +132,13 @@ impl<'a> Transactable for Transaction<'a> {
         index: usize,
         value: ObjType,
     ) -> Result<ExId, AutomergeError> {
-        self.inner
-            .as_mut()
-            .unwrap()
-            .insert_object(self.doc, obj.as_ref(), index, value)
+        self.inner.as_mut().unwrap().insert_object(
+            self.doc,
+            &mut self.op_observer,
+            obj.as_ref(),
+            index,
+            value,
+        )
     }

     fn increment<O: AsRef<ExId>, P: Into<Prop>>(
@@ -142,10 +147,13 @@ impl<'a> Transactable for Transaction<'a> {
         prop: P,
         value: i64,
     ) -> Result<(), AutomergeError> {
-        self.inner
-            .as_mut()
-            .unwrap()
-            .increment(self.doc, obj.as_ref(), prop, value)
+        self.inner.as_mut().unwrap().increment(
+            self.doc,
+            &mut self.op_observer,
+            obj.as_ref(),
+            prop,
+            value,
+        )
     }

     fn delete<O: AsRef<ExId>, P: Into<Prop>>(
@@ -156,7 +164,7 @@ impl<'a> Transactable for Transaction<'a> {
         self.inner
             .as_mut()
             .unwrap()
-            .delete(self.doc, obj.as_ref(), prop)
+            .delete(self.doc, &mut self.op_observer, obj.as_ref(), prop)
     }

     /// Splice new elements into the given sequence. Returns a vector of the OpIds used to insert
@@ -168,10 +176,14 @@ impl<'a> Transactable for Transaction<'a> {
         del: usize,
         vals: V,
     ) -> Result<(), AutomergeError> {
-        self.inner
-            .as_mut()
-            .unwrap()
-            .splice(self.doc, obj.as_ref(), pos, del, vals)
+        self.inner.as_mut().unwrap().splice(
+            self.doc,
+            &mut self.op_observer,
+            obj.as_ref(),
+            pos,
+            del,
+            vals,
+        )
     }

     fn keys<O: AsRef<ExId>>(&self, obj: O) -> Keys<'_, '_> {
@@ -291,7 +303,7 @@ impl<'a> Transactable for Transaction<'a> {
 // intermediate state.
 // This defaults to rolling back the transaction to be compatible with `?` error returning before
 // reaching a call to `commit`.
-impl<'a> Drop for Transaction<'a> {
+impl<'a, Obs: OpObserver> Drop for Transaction<'a, Obs> {
     fn drop(&mut self) {
         if let Some(txn) = self.inner.take() {
             txn.rollback(self.doc);
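Moving the observer into the `Transaction` struct works because Rust permits simultaneous mutable borrows of disjoint fields: every `Transactable` method above passes `self.doc` and `&mut self.op_observer` into the same call. A minimal sketch of that borrow pattern, with stand-in types rather than the real automerge ones:

```rust
// Stand-ins: the real Transaction holds an &mut Automerge and an OpObserver.
struct Doc {
    ops: Vec<String>,
}

struct Observer {
    seen: usize,
}

struct Transaction<'a> {
    doc: &'a mut Doc,
    op_observer: Observer,
}

impl<'a> Transaction<'a> {
    fn put(&mut self, value: &str) {
        // Two &mut borrows at once are fine: they cover disjoint fields of self.
        apply(self.doc, &mut self.op_observer, value);
    }
}

fn apply(doc: &mut Doc, obs: &mut Observer, value: &str) {
    doc.ops.push(value.to_string());
    obs.seen += 1;
}

fn main() {
    let mut doc = Doc { ops: Vec::new() };
    let mut tx = Transaction {
        doc: &mut doc,
        op_observer: Observer { seen: 0 },
    };
    tx.put("a");
    tx.put("b");
    println!("{} ops observed", tx.op_observer.seen);
}
```

This is also why the observer is an owned field rather than another `&mut` parameter: the borrow checker splits field borrows automatically, so no extra lifetime plumbing is needed at each call site.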
@@ -2,11 +2,12 @@ use crate::ChangeHash;

 /// The result of a successful, and committed, transaction.
 #[derive(Debug)]
-pub struct Success<O> {
+pub struct Success<O, Obs> {
     /// The result of the transaction.
     pub result: O,
     /// The hash of the change, also the head of the document.
     pub hash: ChangeHash,
+    pub op_observer: Obs,
 }

 /// The result of a failed, and rolled back, transaction.
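Since the transaction now owns its observer, a successful commit must hand the observer back to the caller; that is the purpose of the new `op_observer` field on `Success`. A sketch of the round trip, with hypothetical stand-in types (`transact_with`, `VecObserver`, and the `u64` hash are illustrations, not the real API):

```rust
// Stand-in for the real Success<O, Obs> result type.
struct Success<O, Obs> {
    result: O,
    hash: u64, // stands in for ChangeHash
    op_observer: Obs,
}

#[derive(Default)]
struct VecObserver {
    events: Vec<String>,
}

// The closure mutates the observer while the transaction runs; on success
// the observer travels back out in the result instead of being dropped.
fn transact_with<O>(
    mut op_observer: VecObserver,
    f: impl FnOnce(&mut VecObserver) -> O,
) -> Success<O, VecObserver> {
    let result = f(&mut op_observer);
    Success { result, hash: 0, op_observer }
}

fn main() {
    let success = transact_with(VecObserver::default(), |obs| {
        obs.events.push("put key".to_string());
        42
    });
    println!(
        "result={} hash={} events={}",
        success.result,
        success.hash,
        success.op_observer.events.len()
    );
}
```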
@@ -1,7 +1,7 @@
 use automerge::transaction::Transactable;
 use automerge::{
-    ActorId, ApplyOptions, AutoCommit, Automerge, AutomergeError, Change, ExpandedChange, ObjType,
-    ScalarValue, VecOpObserver, ROOT,
+    ActorId, AutoCommit, Automerge, AutomergeError, Change, ExpandedChange, ObjType, ScalarValue,
+    VecOpObserver, ROOT,
 };

 // set up logging for all the tests
@@ -1005,13 +1005,8 @@ fn observe_counter_change_application() {
     doc.increment(ROOT, "counter", 5).unwrap();
     let changes = doc.get_changes(&[]).unwrap().into_iter().cloned();

-    let mut doc = AutoCommit::new();
-    let mut observer = VecOpObserver::default();
-    doc.apply_changes_with(
-        changes,
-        ApplyOptions::default().with_op_observer(&mut observer),
-    )
-    .unwrap();
+    let mut doc = AutoCommit::new().with_observer(VecOpObserver::default());
+    doc.apply_changes(changes).unwrap();
 }

 #[test]
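The `AutoCommit::new().with_observer(VecOpObserver::default())` call in the updated test swaps the document's observer type parameter in place of the old `ApplyOptions` plumbing. The underlying pattern is a type-changing builder: a method that consumes `self` and rebuilds the struct with a different generic parameter. A sketch under stand-in types (this `AutoCommit` is a toy, not the real one):

```rust
// Toy document generic over its observer type.
struct AutoCommit<Obs> {
    ops: Vec<String>,
    observer: Obs,
}

impl AutoCommit<()> {
    // A fresh document starts with the unit "null observer".
    fn new() -> Self {
        AutoCommit { ops: Vec::new(), observer: () }
    }
}

impl<Obs> AutoCommit<Obs> {
    // Consumes self and returns a document with a new observer type;
    // the document state carries over unchanged.
    fn with_observer<Obs2>(self, observer: Obs2) -> AutoCommit<Obs2> {
        AutoCommit { ops: self.ops, observer }
    }
}

#[derive(Default)]
struct VecObserver {
    events: Vec<String>,
}

fn main() {
    let doc: AutoCommit<()> = AutoCommit::new();
    let doc: AutoCommit<VecObserver> = doc.with_observer(VecObserver::default());
    println!("ops={} events={}", doc.ops.len(), doc.observer.events.len());
}
```

Because the observer type is part of the document's type, "no observer" costs nothing at runtime, while attaching one is a one-line, compile-time checked change at the call site.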
@@ -3,18 +3,11 @@ set -e

 THIS_SCRIPT=$(dirname "$0");
 WASM_PROJECT=$THIS_SCRIPT/../../automerge-wasm;
 JS_PROJECT=$THIS_SCRIPT/../../automerge-js;
+E2E_PROJECT=$THIS_SCRIPT/../../automerge-js/e2e;

-yarn --cwd $WASM_PROJECT install;
-# This will take care of running wasm-pack
-yarn --cwd $WASM_PROJECT build;
-# If the dependencies are already installed we delete automerge-wasm. This makes
-# this script usable for iterative development.
-if [ -d $JS_PROJECT/node_modules/automerge-wasm ]; then
-    rm -rf $JS_PROJECT/node_modules/automerge-wasm
-fi
-# --check-files forces yarn to check if the local dep has changed
-yarn --cwd $JS_PROJECT install --check-files;
-yarn --cwd $JS_PROJECT test;
+yarn --cwd $E2E_PROJECT install;
+# This will build the automerge-wasm project, publish it to a local NPM
+# repository, then run `yarn build` in the `automerge-js` directory with
+# the local registry
+yarn --cwd $E2E_PROJECT e2e buildjs;
+yarn --cwd $JS_PROJECT test
@@ -1,6 +1,9 @@
 THIS_SCRIPT=$(dirname "$0");
+E2E_PROJECT=$THIS_SCRIPT/../../automerge-js/e2e;
 WASM_PROJECT=$THIS_SCRIPT/../../automerge-wasm;

-yarn --cwd $WASM_PROJECT install;
-yarn --cwd $WASM_PROJECT build;
+# This takes care of publishing the correct version of automerge-types in
+# a local NPM registry and calling yarn install with that registry available
+yarn --cwd $E2E_PROJECT install
+yarn --cwd $E2E_PROJECT e2e buildwasm

 yarn --cwd $WASM_PROJECT test;