Domo Custom Connector: 3 Design Tips for Building an Effective Product

By Ryan Wilander / Consultant

May 11, 2023

Reading Time: 7 minutes

Creating a Domo custom connector is a powerful way for companies to pull data from third-party APIs into Domo. Domo offers a significant number of out-of-the-box connectors (over 1,000) to common data sources, but it would be impossible for one vendor to create connectors for every possible data source. This is where the Domo custom connector comes in, extending your access to just about any data source API you might have.

Why Build Your Own Domo Custom Connector

Developing your own Domo custom connector enables you to be very specific in what data you receive and how you receive it. You can bring in only the data you need, format it precisely as you need it, and store the result in a Domo dataset to use just like any other dataset.

The alternative to a custom connector is the Domo JSON connector (a no-code connector). However, that connector has certain limitations, which we discuss in our previous blog on Domo custom connector vs. JSON no-code connector use cases. It explains the different use cases and helps you determine the best choice for your needs.

This blog covers custom connectors, which are the most robust solution in situations where the JSON no-code or pre-built connector options cannot be used.

Custom Connector Prerequisites
  1. Development Experience
    • JavaScript: Domo custom connectors are written in JavaScript (ES5). If you have little or no experience with other languages, picking up JavaScript on the fly can be a bit challenging, but it is manageable with a coding background.
    • JSON: Familiarity with the JSON format and files. Most API responses are JSON, so this is what your custom connector will be parsing.
    • RESTful APIs: Make sure you are familiar with the API you need to connect to, including its authorization method, endpoints, required headers, and query parameters. Most APIs you will connect to are RESTful, though you may occasionally need to understand other styles.
  2. Clearly defined metrics & API Documentation
    • Understand what data you will need to obtain from your data source
    • The API you are connecting to will normally have documentation. This is crucial; without it, you won't be able to determine the connection requirements. If there is no API documentation, see if you can get in contact with one of the API's developers or administrators.
  3. Familiarize yourself with Domo’s documentation for creating custom connectors.
    • This article does not cover every step of creating a custom connector since Domo’s documentation already does this. Instead, we will go over design choices and certain “gotchas” you will often encounter when developing connectors.

Authorization

The first thing to do is understand the authorization method your API requires. This is almost always found in the API's documentation. While reading it, keep an eye out for things such as required HTTP headers, HTTP body requirements, or multiple endpoints needed to obtain authorization.

The design of your authorization code is highly dependent on the API's design. Make sure you include error handling everywhere you can so debugging is as clear as it can be. This includes wrapping your HTTP responses in conditional statements with clear logging.
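For example, here is a minimal sketch of an authentication check for a username/password account. The token endpoint, request body, and access_token field are all hypothetical; your API's documentation will dictate the actual requirements:

// Hypothetical token endpoint; replace with the URL from your API docs
var tokenUrl = 'https://api.example.com/oauth/token';

httprequest.addHeader('Content-Type', 'application/json');

// Assumes a username/password account type exposed on Domo's metadata object
var body = JSON.stringify({
 username: metadata.account.username,
 password: metadata.account.password
});

var response = httprequest.post(tokenUrl, body);

if (httprequest.getStatusCode() == 200) {
 // 'access_token' is an assumed field name; check your API's response shape.
 // The token would then be added as a header on subsequent requests.
 var token = JSON.parse(response).access_token;
 auth.authenticationSuccess();
} else {
 // Log both the status and the body so failures are easy to diagnose
 DOMO.log('Auth failed, status: ' + httprequest.getStatusCode());
 DOMO.log('Response: ' + response);
 auth.authenticationFailed('Could not authenticate with the provided credentials.');
}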

Data Processing

Once you have been authorized and have obtained data from your API, you can start processing it. This is where the heavy-lifting development effort takes place. In this step, you define the schema you are going to use in your Domo dataset and how you are going to get your data to fit that schema.

I recommend using a third-party tool like Postman, not only to establish your initial connection but also to get a visual of the data structure your API returns for a given endpoint.

Create 3 Main Functions

Each custom connector should contain at least these three principal functions, whatever else you may add:

1. Get Data Function

  • This function will be called from multiple places within your connector app, anywhere you specify an endpoint based on the selected report.
  • At minimum, this function should take a URL parameter. If there are query parameters determined by the endpoint, you can include those here too.
  • This is also where you can account for pagination and API limitations (e.g., requests/min, record limits, etc.).

The goal is to make this function as reusable as possible and have it handle all of the data-gathering responsibilities. See the example below:

function callAPI(url, filters){
 let resend = 0;
 let errorCount = 0;
 let results = [];

 // Placeholder header names/values; replace with what your API requires
 httprequest.addHeader('header1 name', headerValue1);
 httprequest.addHeader('header2 name', headerValue2);

 do {

  // Concatenate the query parameters to the endpoint and make the request
  let data = httprequest.get(url + filters);

  if (httprequest.getStatusCode() != 200) {
   DOMO.log('errorcount: ' + errorCount);
   if (errorCount == 10) {
    resend = 0; // give up after 10 failed attempts
   } else {
    resend = 1; // retry the request
    errorCount = errorCount + 1;
   }
  } else {
   let page = JSON.parse(data);
   // 'records' and 'hasPaginatedData' are placeholders for your API's own field names
   results = results.concat(page.records);
   // Update the query parameters for the next page here if needed
   if (page.hasPaginatedData) {
    resend = 1;
   } else {
    resend = 0;
   }
  }

 } while (resend > 0);

 return results;
}

Feel free to add as many httprequest.getStatusCode() checks as you'd like; in fact, we recommend having more than we included above. Another common pattern is a delay function for cases where the API limits the number of incoming requests.
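A minimal sketch of such a delay is below. The busy-wait assumes the connector runtime executes synchronously, and 429 as the rate-limit status code is an assumption; confirm the actual code in your API docs:

// Simple blocking delay (assumes a synchronous runtime with no setTimeout)
function sleep(milliseconds) {
 let start = Date.now();
 while (Date.now() - start < milliseconds) {
  // busy-wait until the requested time has elapsed
 }
}

// Back off for 2 seconds when the API signals rate limiting
if (httprequest.getStatusCode() == 429) {
 DOMO.log('Rate limited; waiting before retrying...');
 sleep(2000);
}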

More often than not, your data will not have a literal "hasPaginatedData" attribute for you to check, but it will have some indicator you can use to determine whether more paginated data remains. You can verify this ahead of time by visualizing your data with Postman and reviewing your API docs.
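One common pattern is offset-based pagination. In the sketch below, the total and records fields and the limit and offset parameters are assumptions; substitute whatever indicator your API actually documents (a next-page URL, a cursor token, etc.):

// Hypothetical offset-based pagination; field and parameter names are assumptions
function fetchAllPages(url) {
 let offset = 0;
 let pageSize = 100;
 let allRecords = [];
 let page = null;

 do {
  page = JSON.parse(httprequest.get(url + '?limit=' + pageSize + '&offset=' + offset));
  allRecords = allRecords.concat(page.records); // accumulate this page's records
  offset += pageSize;
 } while (offset < page.total); // more records remain beyond this page

 return allRecords;
}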

Keep in mind that APIs differ in which status codes they return and why; all of this should be covered in your API documentation.

2. Create Dataset Function

Now that you have the data, it is time to create the dataset schema using the available Domo Datagrid class methods. You will have to do this for each report you wish to have, so this will actually be a set of functions.

The trickiest and most essential thing to watch out for here is making sure you create all of the columns you need.

To do this, you should answer two questions before developing your Domo dataset:

  • What data fields do I need? All the fields the API returns for this endpoint, or just a select few?
  • Are there any data fields I want to combine into a single field? (See the sketch after this list.)
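If you do decide to combine fields, that happens at parse time rather than in the schema, inside your parsing loop (covered in the Parsing Data Function section below). As a tiny hypothetical example, assuming the API returns firstName and lastName fields that you want in one fullName column:

// Combine two assumed API fields into a single 'fullName' dataset column
datagrid.addCell(row.firstName + ' ' + row.lastName);

The schema function would then declare one fullName column instead of two separate ones.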

Beware of a very common issue in connector development: you visualize your results in Postman to review all the possible fields the endpoint returns, and you believe you have seen them all. But then, in record number 1,243,750 for example, a new field appears and throws off the import of the remaining records into the Domo dataset.

You can avoid this by checking the API documentation to see whether it lists all the possible fields an endpoint can return.

Once you know all of the fields you want to obtain, you can now create a function that looks like the following:

function createSampleDataset(){
   datagrid.addColumn('columnName1', datagrid.DATA_TYPE_STRING);
   datagrid.addColumn('columnName2', datagrid.DATA_TYPE_INTEGER);
   datagrid.addColumn('columnName3', datagrid.DATA_TYPE_DOUBLE);
   ...
   ...
   datagrid.addColumn('columnNameX', datagrid.DATA_TYPE_DOUBLE);
}

Now you’ve got your dataset schema created using Domo Datagrid methods.

3. Parsing Data Function

Next, it’s time to populate this dataset. We will again create a set of functions, typically one for each report.

We must make sure data exists before we put it into a Domo dataset, because Domo requires you to add data in the same order as it is defined in the schema you just created. If there isn't any data for a specific field in a given record, that is fine; we just populate that cell in the dataset with null.

Let's start by taking a look at the code below:

function parseSampleData(data){
 data.forEach(function (row) {
  datagrid.addCell(row.columnName1);
  datagrid.addCell(row.columnName2);
  datagrid.addCell(row.columnName3);
  datagrid.endRow();
 });
}

If I were to instead put columnName3 on the first line of this function, the data for that column would show up under the columnName1 column in my dataset.

Now, take a look at the code below to see how I account for possible missing values:

function parseSampleData(data){
 // Fall back to null so the cell order stays aligned with the schema
 datagrid.addCell(data.columnNameX ? data.columnNameX : null);
}

Because cells must be added in the same order as the columns, the check above is very important. Without it, the addCell call for a missing field can be skipped while the rest of the code continues to run, which throws off your cell order.

Your dataset will still be populated and the run will finish, but values from one column will end up under a different column. As the number of columns in the dataset increases, catching this error becomes more difficult.

Last but not least, don’t forget to add

datagrid.endRow()

at the end of the loop in your parse function. If you don't, you won't see any data in your dataset even though you aren't getting any errors. I'm noting this here so you can avoid a very common mistake: wasting many minutes of debugging before eventually realizing you forgot this one line.

Tying It All Together

Now that you have your 3 main functions, it's time to put them to use. You have some flexibility in where and how you call these functions based on your connector design and the source API's design. However you choose to structure the calls, the Domo Datagrid class must be available before you invoke the functions that use it.

Typically, I follow this structure:

let URL = 'https://www.api.com/data/reportEndpoint';
let filters = '?filters=123';

if (metadata.report === 'Report1') {
 let data = callAPI(URL, filters);
 if (data.length > 0) {
   createSampleDataset();
   parseSampleData(data);
 } else {
   DOMO.log('no data');
 }
}

You can get creative by adding Domo variables to your reports and accessing them through the Domo-provided metadata object to populate your filters.
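For instance, here is a hypothetical sketch assuming a report variable named startDate; how report variables surface on the metadata object is an assumption here, so confirm the accessor in Domo's connector documentation:

let URL = 'https://www.api.com/data/';
let filters = '';

if (metadata.report === 'Report1') {
 URL += 'reportEndpoint';
 // 'metadata.parameters.startDate' is an assumed accessor for a report variable
 filters = '?start_date=' + metadata.parameters.startDate;
}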

Conclusion

The most important thing when creating your Domo custom connector is to make it as dynamic, flexible, and efficient as possible. Once you are done with your connector and submit it to Domo for publishing, you will wait one to two weeks before you can use it. Remember that any time you need to fix something in the connector, you must resubmit it to Domo and go through that publication process and timeframe again.

If you follow the above best practices, you will have a solid design and will likely only need to perform minor maintenance on your custom connector when the source API changes.

Read Other Graphable Domo Dev Articles:

For related analytics and Domo-specific topics, check out these helpful articles: What is ChatGPT?, Analytics for ChatGPT, and What is Text Analytics?


Graphable helps you make sense of your data by delivering expert data analytics consulting, data engineering, custom dev and applied data science services.

We are known for operating ethically, communicating well, and delivering on-time. With hundreds of successful projects across most industries, we have deep expertise in Financial Services, Life Sciences, Security/Intelligence, Transportation/Logistics, HighTech, and many others.

Thriving in the most challenging data integration and data science contexts, Graphable drives your analytics, data engineering, custom dev and applied data science success. Contact us to learn more about how we can help, or book a demo today.

Additional discovery:

We would also be happy to learn more about your current project and share how we might be able to help. Schedule a consultation with us today. We can also discuss pricing on these initial calls, including Neo4j pricing and Domo pricing. We look forward to speaking with you!

