
Spotlight on Ask On Data

by Shawn Gordon, April 1st, 2024

Too Long; Didn't Read

Ask On Data is the world’s first NLP & AI-based Data Engineering tool. You connect a job to data files that have been registered, then describe the transformations you want in plain language. Ask On Data has autocomplete of commands and columns, which is convenient.


AI has been all the rage since late 2022, and it has many more practical applications than we saw from the blockchain craze. We’re seeing AI crop up in many products and in many forms, some more useful than others. One of the segments I’ve been watching is how it can work in a data exploration context. To that end, I ran across a new product called “Ask On Data,” created by the originators of the BI tool Helical Insight, which I wrote about in 2019. They say it is the world’s first NLP & AI-based Data Engineering tool, so let’s take a look.

Getting Started

I got an early look at the product a couple of months ago, and they are iterating quickly with updates; the UI was updated again even as I started writing this piece a few weeks ago. The docs are kept up to date, though, and they are what I used to try out the product. They are clean and succinct, making them easy to reference.


The product presents a clean interface, and the first thing you need to do is add some data, which is done with the center icon in the upper right:


initial screen


There is a robust choice of connectors for a product just coming out of the gate. I had some voter registration Excel files I had just been looking at, so we go with the Flat Files option here.


connector options



Now, we go back and create a new job using the icon on the home page, upper left, that looks like a speech bubble. You connect a job to data files that have already been registered, which is convenient because you don’t have to keep loading them for each job.

Working with your data

With my voter roll file loaded, I first wanted to sort it by last name, and I could do that by literally saying “sort by sznamelast ascending”:


main interface



Next, I wanted to count each occurrence of the street name so I’d know how many voters live on each street. Ask On Data has autocomplete of commands and columns, which is convenient:


count command


This command gave me the following:


group and count results


Export options are also included, so any of these resulting datasets can be exported to use in another tool. Take a look at the docs for a complete list of currently available transformations.


Those are some quick examples, but it was fascinating to simply say what I wanted from the data and get it back rapidly. I had an older friend, whose data I was working with, give it a little try; he hates computers, so he usually asks me to help him. This was a much more intuitive experience for him than navigating the user interface of something like Excel. That was an eye-opener for me: computers are like swimming for me, something I have done for so long that I have trouble understanding when people can’t.

Other features

In the non-free version of the application, you can also schedule these as jobs. For example, you can connect to a database, pull weekly sales data, and run it through some aggregation functions. That could be set up to run every Sunday night, and then you come in on Monday morning and have your sales report ready.
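
The scheduling itself is configured in the product, but as a minimal sketch of the kind of aggregation such a weekly job might run, here is a generic SQL rollup; the sales table and column names are hypothetical, not something taken from Ask On Data:

-- Minimal sketch of a weekly sales rollup; table and column names are
-- hypothetical, and the date arithmetic syntax varies by database.
SELECT
    region,
    COUNT(*)         AS orders,
    SUM(order_total) AS weekly_revenue,
    AVG(order_total) AS avg_order_value
FROM sales
WHERE order_date >= CURRENT_DATE - INTERVAL '7' DAY
GROUP BY region
ORDER BY weekly_revenue DESC;

Scheduled for Sunday night, a report built on something like this is what would be waiting for you Monday morning.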


The command window also has slash (/) commands that allow you to execute an Excel-like expression, SQL, or a BI-style command.


slash commands



I tested out some expressions, which were straightforward enough; the SQL commands are also what you’d expect, except that you refer to the current working data as df (for dataframe). It looked like this:


SQL expression


If you’re an SQL person, then this will feel pretty natural for you. I think these slash commands are very much a work in progress at the moment, but those two worked well for my use case.
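
As a rough sketch of what I mean, a SQL slash command against the working data might look something like the query below, here redoing the street-name count from earlier; df is how the tool refers to the current dataset, while the street-name column is hypothetical (your file’s column names will differ):

-- Sketch of a query against the current working data, exposed as df.
-- szstreetname is a hypothetical column name; substitute whatever your file uses.
SELECT szstreetname, COUNT(*) AS voters
FROM df
GROUP BY szstreetname
ORDER BY voters DESC;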

Summary

This is a new product, and as such, it is iterating rapidly. Some of the UX flow needs polish, but I’m told part of that is related to the training. For example, I had specified the entire command in the earlier count example, but after I issued it, the product still asked me which aggregation I wanted to perform, so I had to say count again. My initial hurdle was simply the mindset of saying what I wanted; I’ve been using computer software for so long that I’m accustomed to working with it in a particular way. Understanding prompt engineering is definitely the wave of the future, and it behooves all of us to add it to our arsenal.


This is a fun product to work with, and I appreciate the unconventional thinking that went into it. There is definitely a segment of the public that could use an easier way to interact with data, one that doesn’t require them to learn a lot of new skills. Is it a good tool for your use case? Try it and see. I liked it.