In early 2000, Salesforce launched its cloud customer relationship management solution as software as a service (SaaS). It shipped with the first-ever web-based application programming interface (API), which transformed the way businesses and developers communicate with each other. Now, 21 years later, Salesforce alone handles a billion API calls every day, and almost all software companies rely on APIs to exchange information. An API is a messenger that takes requests, translates them for the server, and returns responses. Think of it like sitting at a restaurant while the waiter takes your dinner order. They write down your order for a steak on their notepad (an API request being translated), take it to the kitchen where the chef cooks your dish (the program runs), and come back to you with a delicious meal (an API response).
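The waiter-and-kitchen cycle above can be sketched in a few lines of code. This is a toy illustration only, not a real network call: one function plays the server-side "kitchen" and a few lines of client code play the diner.

```python
import json

# The "kitchen": a server-side handler that receives a JSON request,
# runs some logic, and returns a JSON response.
def handle_order(request_json: str) -> str:
    order = json.loads(request_json)                  # the waiter relays the order
    dish = f"cooked {order['item']}"                  # the chef prepares the dish
    return json.dumps({"status": 200, "dish": dish})  # the meal comes back

# The "diner": a client building a request and reading the response.
request = json.dumps({"item": "steak", "table": 12})
response = json.loads(handle_order(request))
print(response["dish"])  # -> cooked steak
```

In a real API the request would travel over HTTP, but the shape of the exchange – structured request in, structured response out – is exactly the same.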
There are a few different types of web APIs; Coherent Spark uses REST, a popular approach for defining APIs that separates development between the back end and the front end. Simple to build, closely aligned with HTTP, and supported by a wide range of programming languages, REST helps Spark achieve light-speed performance and integrations with multiple systems. Whether it is a quick quote calculator or a point-of-sale system, the easy-to-ingest API response from Spark allows any Excel pricing model to be converted into a web application. This is our way of creating APIs from an Excel file – whether an insurance pricing model or a home mortgage calculator, Spark can convert any Excel logic into an API in seconds!
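To make the idea concrete, here is a minimal sketch of what calling such a REST service could look like from the client side. The payload structure, endpoint path, and field names here are purely illustrative assumptions, not the actual Spark API contract.

```python
import json

# Hypothetical request payload for executing a converted Excel model
# over REST (field names are illustrative, not Spark's real schema).
def build_execute_request(inputs: dict) -> str:
    return json.dumps({"request_data": {"inputs": inputs}})

payload = build_execute_request({"age": 35, "coverage": 250_000})

# In practice the payload would be POSTed to the service, e.g.:
# requests.post("https://<tenant>/services/quote/execute",  # placeholder URL
#               data=payload,
#               headers={"Content-Type": "application/json"})
print(payload)
```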
APIs are very common in daily life, too. Have a quick look at your phone – the weather app calls an API to show you the temperature in New York, the log-in page on your Twitter account does the same to return the correct account, Instagram bots use this technology to add attractive and relevant comments, and so on. Here, the sky is literally the limit, because even companies like NASA use APIs, for example to detect the location of objects in space! In general, APIs increase efficiency by allowing quick access to data that can be shared easily. They also improve development speed, because data migration is better supported and coding is more flexible. This leads to better productivity for end users, as they don't have to jump between apps, and of course to better performance. Without APIs, all services that run on cloud-based servers – online messaging (WhatsApp, Facebook Messenger, Snapchat, etc.), streaming sites like Netflix, even online banking – would cease to exist. And because customers are becoming more and more dependent on this architecture, it's important to understand what makes a good API.
Have a strong API format
Most companies rely on guidelines that dictate how their API requests and responses should look. Consistency saves time and money during the initial implementation, as well as during any later changes in development. The Spark API response follows a structure that makes sure all users can integrate easily. Similar to how, at a restaurant, there is a set way of telling the chef what has been ordered and of presenting the food – strictly defined, simple to understand, and above all, known to all parties involved. The API format is also where possible error codes and warnings should be documented.
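One common way to enforce this consistency is a fixed response envelope: every response carries the same top-level keys, so integrators always know where to find the data, errors, and warnings. The sketch below is an illustrative envelope, not the exact Spark response format.

```python
import json

# A minimal, consistent response envelope (illustrative only): the same
# top-level keys appear in every response, success or failure.
def make_response(data=None, errors=None, warnings=None) -> str:
    return json.dumps({
        "status": "error" if errors else "ok",
        "data": data,
        "errors": errors or [],
        "warnings": warnings or [],
    })

ok = json.loads(make_response(data={"premium": 120.50}))
bad = json.loads(make_response(errors=[{"code": "E100", "message": "missing field"}]))
print(ok["status"], bad["status"])  # -> ok error
```

Because the keys never change, client code can check `status` and `errors` the same way for every endpoint.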
Build a data dictionary
“A data dictionary is a collection of descriptions of the data objects or items in a data model for the benefit of programmers and others who need to refer to them.” This should be the single source of truth for developers integrating with an API. A good example of where a data dictionary helps is a date of birth. Imagine if this field were called ‘DOB’ in one API request and ‘DateOfBirth’ in another. With the help of the dictionary, teams mapping API request fields can make sure the field is called “Date_Of_Birth” in all APIs. On Spark, the maintained data dictionary specifies the description, allowed data types, example values, and any validations for every request and response parameter in every API. This dictionary should be updated regularly. A data dictionary is as important as your chef’s cookbook!
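A tiny data dictionary can be sketched as a mapping from each canonical field name to its description, type, example value, and validation rule. The field name and rules below are illustrative, not Spark's actual dictionary entries.

```python
from datetime import date

# A toy data dictionary (illustrative): one canonical entry per field,
# shared by every API that uses the field.
DATA_DICTIONARY = {
    "Date_Of_Birth": {
        "description": "Policyholder's date of birth",
        "type": date,
        "example": date(1990, 4, 1),
        "validate": lambda v: v < date.today(),  # must be in the past
    },
}

def validate(field: str, value) -> bool:
    """Check a value against its data-dictionary entry."""
    entry = DATA_DICTIONARY[field]
    return isinstance(value, entry["type"]) and entry["validate"](value)

print(validate("Date_Of_Birth", date(1990, 4, 1)))  # -> True
```

Because every team validates against the same entry, ‘DOB’ vs. ‘DateOfBirth’ mismatches never make it into production.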
Secure your API
Your order at a restaurant is secured by the fact that you paid for it and a bill is generated with details such as which table you are sitting at, which makes sure your meal is not served to anyone else in the restaurant. Similarly, once an API has been built, it is very important to secure calls to it. For example, to call a Spark API, users can authenticate with Keycloak via OAuth2 and generate a JWT token. Available only to admins, these tokens act like passwords for viewing the content of your API. The admin can manage the keys through Spark, although each key is visible only once, when it is created. It's important to rotate secret keys regularly and deactivate those that aren't in use.
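A JWT is simply a signed, base64-encoded set of claims, and one routine client-side task is checking whether a token has expired before using it. The sketch below decodes the payload of a toy, unsigned token to read its `exp` claim; real integrations should verify the signature with a proper JWT library rather than trusting the payload as done here.

```python
import base64
import json
import time

# Decode a JWT's payload segment (no signature verification -- for
# illustration only; production code must verify signatures).
def jwt_payload(token: str) -> dict:
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def is_expired(token: str) -> bool:
    return jwt_payload(token).get("exp", 0) < time.time()

# Build a toy (unsigned) token just for demonstration.
header = base64.urlsafe_b64encode(b'{"alg":"none"}').decode().rstrip("=")
claims = base64.urlsafe_b64encode(
    json.dumps({"sub": "admin", "exp": int(time.time()) + 3600}).encode()
).decode().rstrip("=")
token = f"{header}.{claims}."

print(is_expired(token))  # -> False (valid for another hour)
```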
Wrap it up in Documentation!
Lastly, to share your API, build a document that showcases all the important features: the request and response parameters included, their allowed values, error states, and integration methods such as sample cURL statements. For each API created on Coherent Spark, API documentation is generated automatically. This drives better adoption rates as well as a complaint-free development experience in the later stages of integration.
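Sample cURL statements like the ones mentioned above can themselves be generated from the documented parameters. The helper below is an illustrative sketch (the URL, header set, and token placeholder are assumptions, not what Spark's generated documentation actually emits).

```python
import json
import shlex

# Build a copy-pasteable sample cURL command for API docs
# (URL and token are placeholders, for illustration only).
def sample_curl(url: str, body: dict, token: str = "<YOUR_TOKEN>") -> str:
    return (
        f"curl -X POST {shlex.quote(url)} "
        f"-H 'Content-Type: application/json' "
        f"-H 'Authorization: Bearer {token}' "
        f"-d {shlex.quote(json.dumps(body))}"
    )

print(sample_curl("https://example.com/api/v1/quote/execute", {"age": 35}))
```

Generating these snippets from the same source as the data dictionary keeps the documentation and the API itself from drifting apart.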
APIs started out as internal-only tools that let developers within a company maintain code efficiently. While this helped with privacy, it made integrating with third-party platforms difficult. As a result, more and more open APIs are being created, where the public has access to all documentation and data. Enterprises like Salesforce, Microsoft, Twitter, Postman, and many more publish their APIs online for other businesses to integrate with. Smartly built, secure, and well-managed APIs are a boon for industries across the globe. Start your journey with APIs on Coherent Spark and learn along the way!
Product Analyst for Coherent Spark
Sadhana is a Product Analyst for Coherent Spark and helps build new features and updates on the platform. She has been a part of the team since its inception and owns vital pieces including the Spark Assistant, User Management, and others. Apart from working closely with developers and designers on product development and user onboarding, Sadhana is very hands-on with the release cycle and deployment management processes for Spark, and manages User Guide updates and release communications to customers!