Generated the 4TB field and deleted it
would be too expensive to copy the damn thing out of the cloud
generating it locally as well to make sure the hash comes out the same
but no rush, it would be a protocol change to enable it anyway
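A minimal sketch of that local hash check, assuming a plain streaming SHA-256 over the whole file (the digest the field actually commits to isn't stated here, and the file path is just a placeholder):

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.util.HexFormat;

public class FieldHash {
    // Streams the file through SHA-256 so a multi-TB field never has to fit in memory.
    public static void main(String[] args) throws Exception {
        MessageDigest sha256 = MessageDigest.getInstance("SHA-256");
        byte[] buf = new byte[1 << 20]; // 1 MiB read buffer
        try (InputStream in = Files.newInputStream(Path.of(args[0]))) {
            int n;
            while ((n = in.read(buf)) != -1) {
                sha256.update(buf, 0, n);
            }
        }
        // Print the hex digest; compare it against the hash of the cloud-generated copy.
        System.out.println(HexFormat.of().formatHex(sha256.digest()));
    }
}
```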
4TB, lol
i for one eagerly wait for 16TB fields
I might need to shift the generation algorithm for that
this one is pretty brutal
took 14 days to generate the 4TB file on a GCE instance
but it should be linear with size
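Back-of-the-envelope, if it really is linear: 14 days × 16TB / 4TB ≈ 56 days for a 16TB field on the same kind of instance.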
is it possible to set up the daemon and wallet on a Raspberry Pi?
yes
bazel might not work well (or at all)
so it's easier to copy the jar files onto a Pi
but should work just fine
might have to use lobstack rather than rocksdb
not sure if the rocksdb library has an ARM binary
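A rough way to check that on the Pi itself, assuming the daemon uses the standard rocksdbjni artifact (the fallback is only noted in a comment, since the lobstack API isn't shown here):

```java
import org.rocksdb.RocksDB;

public class ArmStorageCheck {
    public static void main(String[] args) {
        try {
            // rocksdbjni bundles per-platform native libraries inside the jar;
            // this fails if no matching binary (e.g. for the Pi's ARM CPU) is included
            RocksDB.loadLibrary();
            System.out.println("rocksdb native library loaded, should be usable here");
        } catch (UnsatisfiedLinkError | RuntimeException e) {
            // no native binary for this platform: use the pure-Java store (lobstack) instead
            System.out.println("rocksdb unavailable on this platform: " + e);
        }
    }
}
```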
what's bazel?
The build system we use
Made by Google, works pretty well
Plus I know how to use it, mostly
where can i get the jar files? are they on github?
The release zips have them
Or you can build them on a regular Linux machine
i want the least complicated route
Releases from the website