Org-roam on an e-ink paper tablet

I recently stumbled again across e-ink paper tablets like the reMarkable 2.
The reMarkable runs Linux and apparently has quite a large FOSS community.
I would love to have a device like this that sits between having to carry a laptop everywhere and the inferior org experience on my phone.
Pine64 is currently also developing the PineNote, a 10" Linux e-ink tablet.

Has anyone thought about this, or does anyone know of a project that supports it?

If there is interest in the community, this could also be a nice project to start.
Here are some thoughts on what would be necessary to make this work:

  • Installing GUI Emacs. This should work, since it’s a Linux tablet, though I don’t know whether it runs X.
  • Decent stylus/pen integration in Emacs, with a context menu for navigation
    (an on-screen keyboard might be sufficient for navigation).
  • Mapping touch gestures to actions, for example in org-agenda or for indenting headlines.
  • Taking notes with the pen:
    • Store the handwriting as a picture and separately create links to other notes (see the sketch after this list).
    • Handwriting-to-text conversion.
    • Integrate with the default (or another) drawing/handwriting app or interface?
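
For the “store it as a picture” idea, here is a minimal sketch of what the Emacs-side glue could look like, assuming the tablet’s drawing app exports PNG files into a known directory (the directory and the helper name are hypothetical):

    (defvar my/pen-notes-dir "~/pen-notes/"
      "Hypothetical directory where the drawing app exports PNG files.")

    (defun my/insert-latest-pen-note ()
      "Insert an org file link to the most recently exported pen drawing.
    A rough sketch: pick the newest PNG in `my/pen-notes-dir' and
    link it from the current buffer."
      (interactive)
      (let ((latest (car (sort (directory-files my/pen-notes-dir t "\\.png\\'")
                               #'file-newer-than-file-p))))
        (if latest
            (insert (format "[[file:%s]]" latest))
          (user-error "No pen notes found in %s" my/pen-notes-dir))))

Linking in the other direction, from an org note back into a specific drawing, would presumably need support from the drawing app itself.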

I think the actual note-taking part would be the most difficult, and it kind of forces Emacs to diverge from its keyboard-centric philosophy.

I’d like to hear other people’s thoughts and ideas.


I got a development version a while ago, but I haven’t had time to work on it. X works fine so far, and keyboard-based handling of Emacs should be fine as well, but developing a touch-based / pen-based interface is going to be a whole different story. I plan on using it mostly for reading and annotating PDFs (perhaps using xournalpp or a modified version of KOReader), and synchronising those so they’re linked in my bib files. I’m not sure where to go from there, but I will see how it goes and report back once I’ve got around to it.


Emacs 27.2 is fairly usable under both X and Wayland on the PineNote, provided one uses a keyboard. Pen events and touch events are both translated into mouse events. The starting point would probably be to make all relevant information from the XInput2 events available in Lisp, starting with the originating device, so we can distinguish pen input from touch input. This can be tested without a touchscreen or a PineNote, as any second input device (e.g. a USB drawing tablet or a second mouse) should do.
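
Until that device information is exposed in Lisp, one can at least enumerate the candidate devices from the shell. A small convenience wrapper, assuming the xinput command-line tool is installed (the function name is mine):

    (defun my/list-input-devices ()
      "Show the X input devices visible to the server.
    Useful for checking which ids a pen, a touchscreen, or a second
    mouse show up under."
      (interactive)
      (with-current-buffer (get-buffer-create "*xinput*")
        (erase-buffer)
        (call-process "xinput" nil t nil "list" "--short")
        (display-buffer (current-buffer))))

A pen, a touchscreen and a second mouse each appear as separate slave pointer devices, which is exactly the distinction the Lisp event structs would need to carry.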

I’m considering integrating some of these things into pdf-tools, as my main applications are reading and annotating PDF files (mostly text-based highlighting, perhaps some scribbling at the side, which is not implemented yet). At the same time, a stand-alone application directly using libinput (which exposes more events, including proximity events) and poppler would mean less work and better rendering: more fine-grained control over the waveform / rendering surface gives noticeable improvements in latency and rendering quality when visualising pen input.
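
For the text-based highlighting part, a first experiment needs no new C-level support at all, since pen input already arrives as mouse events: select text with the pen, then call pdf-tools’ existing annotation command. A hedged sketch (the command name is mine; pdf-annot-add-highlight-markup-annotation and pdf-view-active-region are existing pdf-tools functions):

    (require 'pdf-annot)  ; part of pdf-tools

    (defun my/pen-highlight-region ()
      "Turn the current pen/mouse selection into a highlight annotation.
    Pen events arrive as mouse events, so dragging the pen over text
    already sets the active region in `pdf-view-mode'; this wires that
    region to pdf-tools' existing annotation command."
      (interactive)
      (pdf-annot-add-highlight-markup-annotation (pdf-view-active-region t)))

Bound to whatever event the pen’s barrel button generates, this would give basic highlighting while the lower-level latency and rendering questions are still open.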

While not exactly the same, something along these lines that caught my attention was “Prose: the distraction-free, e-ink laptop that should exist”:

How much I would love something like this… 🙂