Machine learning, LiveView-powered Desktop Applications

There are multiple ways to interact with your software, and one of the most common is a desktop application. It’s a tricky area, but a recent project, Tauri, has made it a lot easier. In this post we’re going to build a LiveView application that uses Bumblebee, together with a new proof-of-concept library called ex_tauri that installs Tauri in your Phoenix project.

In the beginning, there was the Web App

All the code is available in the Grammarlocal repository

First we create a new application, which I will call Grammarlocal, the usual way: mix phx.new grammarlocal --no-gettext --no-mailer --no-dashboard --no-ecto

Then we build a really basic UI using LiveView: a simple page with a single text area

defmodule GrammarlocalWeb.InputLive do
  use GrammarlocalWeb, :live_view

  def mount(_params, _session, socket) do
    socket =
      socket
      |> assign(:output, "")
      |> assign(:input, "")
      |> assign(:loading, false)

    {:ok, socket}
  end

  def render(assigns) do
    ~H"""
    <div>
      <textarea
        phx-blur="run"
        class="rounded-xl border border-slate-200 shadow w-full h-auto p-3 resize-none mb-10"
        rows="10"
      ><%= @input %></textarea>
      <div :if={@loading} class="w-full flex justify-center">
        <img src={~p"/images/loading.png"} class="animate-spin-slow" width="42" />
      </div>
      <div :if={!@loading} class="w-full p-3"><%= @output %></div>
    </div>
    """
  end
end

Time to make it smart

Now we’ll make it smart using our usual suspect, Bumblebee, together with a dedicated model from Grammarly (coedit-large).

We load the model and prepare the serving before using it.

defmodule Grammarlocal.Application do
  @moduledoc false

  use Application

  @impl true
  def start(_type, _args) do
    Nx.global_default_backend(EXLA.Backend)

    {:ok, model} = Bumblebee.load_model({:hf, "grammarly/coedit-large"})
    {:ok, tokenizer} = Bumblebee.load_tokenizer({:hf, "grammarly/coedit-large"})
    {:ok, generation_config} = Bumblebee.load_generation_config({:hf, "grammarly/coedit-large"})

    generation_config = %{generation_config | max_new_tokens: 1500}

    serving =
      Bumblebee.Text.generation(model, tokenizer, generation_config,
        defn_options: [compiler: EXLA]
      )

    children = [
      GrammarlocalWeb.Telemetry,
      {Phoenix.PubSub, name: Grammarlocal.PubSub},
      GrammarlocalWeb.Endpoint,
      {Nx.Serving, serving: serving, name: Grammarlocal.Serving, batch_timeout: 100}
    ]

    opts = [strategy: :one_for_one, name: Grammarlocal.Supervisor]
    Supervisor.start_link(children, opts)
  end
end
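With the serving supervised, you can sanity-check it from IEx before wiring up the UI. A quick sketch, assuming the app is started with `iex -S mix` (note the first call will be slow while EXLA compiles the computation):

```elixir
# In an IEx session, call the serving registered in the supervision tree.
# The result is a map with a :results list containing the generated text.
Nx.Serving.batched_run(Grammarlocal.Serving, "She no went to the market.")
# => %{results: [%{text: ...}]}
```
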

Then we go back to our GrammarlocalWeb.InputLive module and handle the events as expected


  def handle_event("run", %{"value" => value}, socket) do
    # Run the model in a Task so the LiveView process stays responsive
    Task.async(fn -> Nx.Serving.batched_run({:local, Grammarlocal.Serving}, value) end)
    {:noreply, socket |> assign(:input, value) |> assign(:loading, true)}
  end

  # The Task replies with a {ref, result} message carrying the generated text
  def handle_info({_ref, %{results: [%{text: text}]}}, socket) do
    {:noreply, socket |> assign(:output, text) |> assign(:loading, false)}
  end

  # Task.async also monitors the task process, so ignore its :DOWN message
  def handle_info({:DOWN, _ref, :process, _pid, :normal}, socket), do: {:noreply, socket}

This runs the model in a Task, captures the result message from that Task, and handles the Task shutdown message.
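To see why both handle_info clauses are needed, here is a minimal, self-contained sketch of the two messages Task.async delivers to the caller, using a stand-in result map instead of a real serving call:

```elixir
# Task.async sends the caller two messages: {ref, result}, then a :DOWN signal.
task = Task.async(fn -> %{results: [%{text: "corrected text"}]} end)

receive do
  # Same shape the LiveView matches on in handle_info/2
  {ref, %{results: [%{text: text}]}} when ref == task.ref ->
    IO.puts(text)
end

receive do
  # Because Task.async monitors the task, a :DOWN message follows
  {:DOWN, ref, :process, _pid, :normal} when ref == task.ref ->
    IO.puts("task finished")
end
```
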

In the end, we have a web application that receives text and runs it through the model, which applies orthographic corrections!
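One detail worth knowing: the CoEdIT models are instruction-tuned, so results tend to improve when the input carries a task instruction. The prefix and helper below are my assumption from the model card, not something the code above does:

```elixir
# Hypothetical helper: prepend a CoEdIT-style instruction before
# sending the text to the serving (prefix wording is an assumption).
defmodule Grammarlocal.Prompt do
  def fix_grammar(text), do: "Fix grammatical errors in this sentence: " <> text
end

Grammarlocal.Prompt.fix_grammar("She no went to the market.")
# => "Fix grammatical errors in this sentence: She no went to the market."
```
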

Desktop time

Now comes the “new thing” in this post: turning this into a Desktop application you can install locally and distribute.

To that end we will use Tauri. We could pull in all the dependencies ourselves and go down that path without issue, but I’ve created a proof of concept to install Tauri in your project easily: the library ex_tauri.

How does it work?

ex_tauri will be your friend here: it installs the tauri-cli utility in your _build folder, which can then scaffold a Tauri project for you.

After that we run the Tauri CLI, which packages your Phoenix application with Burrito into a single binary to be used by Tauri as a sidecar.

In the end, Tauri grabs our Phoenix binary, starts it up, waits for it to be up and running on the specified port, and opens a web view with your LiveView application in all its glory!
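For reference, Tauri 1.x declares sidecar binaries through the externalBin field of tauri.conf.json; the project ex_tauri generates inside src-tauri contains something along these lines (an illustrative sketch based on the Tauri docs, not the exact generated file):

```json
{
  "tauri": {
    "bundle": {
      "externalBin": ["binaries/desktop"]
    }
  }
}
```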

Let’s do it!

Let’s add our dependency

defp deps do
    [
      {:ex_tauri, git: "https://github.com/filipecabaco/ex_tauri"}
    ]
end

Add the config for tauri in our config.exs

config :ex_tauri,
    version: "1.4.0",
    app_name: "Example Desktop",
    host: "localhost",
    port: 4000

Then we need to add a release configuration to our mix.exs

  def project do
    [
      app: :grammarlocal,
      version: "0.1.0",
      elixir: "~> 1.14",
      elixirc_paths: elixirc_paths(Mix.env()),
      start_permanent: Mix.env() == :prod,
      aliases: aliases(),
      deps: deps(),
      releases: [
        # We need this to be named desktop for this PoC
        desktop: [
          steps: [:assemble, &Burrito.wrap/1],
          burrito: [
            targets: [
              # At the moment we still need this really specific names
              "aarch64-apple-darwin": [os: :darwin, cpu: :aarch64]
            ]
          ]
        ]
      ]
    ]
  end

Now you just need to run mix ex_tauri.install, which installs all the dependencies in your _build folder and also creates a new folder called src-tauri with all the code required to start up your Tauri application.

Run it!

After you have everything installed, you can run the command mix ex_tauri dev. All arguments to mix ex_tauri are forwarded to the Tauri CLI, so you can explore other commands such as mix ex_tauri info, which gives you some information about your Tauri application.

But with mix ex_tauri dev you will see your application up and running on your Desktop: your own local Grammarly.

Desktop application running with a web view showing LiveView

Web View vs Native

Both approaches to the same problem are fine as long as you are aware of the limitations and, more importantly, have fun building stuff regardless of the approach 😁.

