I’ve been working for a while on an Elixir/Phoenix + Elm project for writing and publishing online technical documents — math, physics, and suchlike, a major retooling of manuscripta.io with a functional backend and frontend.

The Phoenix 1.3 backend is coming along nicely and has a versatile but simple search feature that I’d like to describe (code is on GitHub). Since this is a JSON API, searches are given by a string like

title=bird&title=blue&tag=song

This search will return articles with “blue” and “bird” in the title and with “song” in the list of tags.

It is up to the developer on the client side to decide how her users want to interact with this little search language. What I chose to do is to have a search box in which the user types words separated by spaces (like Google). These are used as keys to search the title and the tags. Thus in practice, the user would type bird blue song, with an option of doing geekier things to constrain or widen the choices, e.g.,


in the case that we want articles or research notes or whatever on cackling birds.

There are two main ingredients in the recipe for the query language. The first is that in Elixir, one can write “composable queries,” that is, queries that fit together like Legos, or, better, like short sections of pipe. The query for an author, for example, is like this:

def for_author(query, author_id) do
  from d in query,
  where: d.author_id == ^author_id
end

The query for full-text search is

def has_text(query, term) do
  from d in query,
  where: ilike(d.content, ^"%#{term}%")
end

They can be snapped together like this to find the articles by ghost23 which contain the word ladidah somewhere in the text:

Document |> for_author("ghost23") |> has_text("ladidah") |> Repo.all 

Viva the pipe operator!
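
The tag search behind tag=song in the examples isn’t shown in the post; here is a minimal sketch of what it might look like, assuming the tags live in a Postgres text-array column named tags (the function name has_tag is my own, not from the original code):

```elixir
# Hypothetical tag query, assuming a Postgres text[] column `tags` on
# the documents table. `fragment` drops down to raw SQL so we can use
# Postgres's ANY operator for array membership.
def has_tag(query, tag) do
  from d in query,
  where: fragment("? = ANY(?)", ^tag, d.tags)
end
```

It snaps into the same pipeline as the others: Document |> has_tag("song") |> Repo.all.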

The second ingredient is the task of translating a query string like author=ghost23&text=ladidah. A function parse_query_string transforms the query string into a list of lists, namely [["author", "ghost23"], ["text", "ladidah"]]. Each item of the list is of the form [cmd, arg], where cmd tells what kind of search to make, and arg gives the information needed to make that search. Third, a “dispatch function,” Search.by(query, cmd, arg), will consume elements of the list, yielding the needed queries one by one. The dispatcher is just a big case statement. No big deal:

def by(query, cmd, arg) do
  case {cmd, arg} do
    {"author", _} ->
      for_author(query, arg)
    {"title", _} ->
      select_by_title(query, arg)
    # ... one clause per search command ...
  end
end

Fourth, and best of all, is the Reducer, for which I thank @michalmuskala on the friendly ElixirForum:

def by_command_list(command_list) do
  command_list
  |> Enum.reduce(Document, fn [cmd, arg], query ->
    Query.by(query, cmd, arg)
  end)
  |> Repo.all
end

This short snippet of code, featuring the cool, powerful and truly awesome reduce function, takes a valid list of lists and transforms it into a database query. To run that query, we just do this:

def by_query_string(query_string) do
  query_string
  |> parse_query_string
  |> by_command_list
end
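
The post never shows parse_query_string itself; here is a minimal sketch, assuming the query string is ordinary key=value pairs joined with & (the module name Search.Parser is my own invention):

```elixir
defmodule Search.Parser do
  # Transform "title=bird&tag=song" into [["title", "bird"], ["tag", "song"]],
  # the list-of-lists shape that by_command_list expects.
  def parse_query_string(query_string) do
    query_string
    |> String.split("&")                          # split into "key=value" chunks
    |> Enum.map(&String.split(&1, "=", parts: 2)) # split each chunk into [key, value]
  end
end
```

With parts: 2, a value that itself contains an = sign is left intact rather than split again.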

Or, to be, like, totally specific, we say this:

Search.by_query_string("title=bird&title=blue&tag=song")

Not many lines of code to implement a reasonably powerful search language — with lots of room for growth and improvement, of course!
