In my role as a consultant for CleanSlate Technology Group, I am required to enter my time on a periodic basis. Our company currently utilizes the FinancialForce solution within Salesforce for professional services billing. For years, I have found that it is easier to enter my time on a daily basis. This allows me to provide detailed comments on what was accomplished with each time entry. At the start of the month, I work with the service delivery manager and my clients to establish a personal goal. This goal is normally set in total hours for a given month.
I like to track progress toward my goal with a personal burndown chart, so I can quickly see how my billable hours compare to the hours expected. I soon realized that Salesforce did not provide a default reporting solution that met my needs. Using Microsoft Excel, I was able to create a spreadsheet that included a burndown chart:
I began to wonder how quickly and easily I could automate this process of connecting Salesforce data using traditional cloud options, such as Heroku. After all, Heroku is a Salesforce company.
I knew that Salesforce would be my source of record. I also knew I wanted to build the burndown chart using Highcharts. Now, I needed to figure out how to connect the two items.
The client side was a quick decision, since I maintain a pretty good understanding of Angular. I quickly found the highcharts-angular npm package, which is the official minimal Highcharts wrapper for Angular.
Looking at Heroku's options, I found Heroku Connect, which provides two-way Salesforce connectivity out of the box, using Heroku Postgres as the data source available within a Heroku Dyno. This was exactly the solution I needed: it spared me from handling Salesforce connectivity myself and required no knowledge of Salesforce data concepts, neither of which was essential to this effort.
With the source data from Salesforce now available in the Heroku PostgreSQL database, I could quickly build an API for the Angular client using the Spring Boot framework.
As a result, the following data flow, stemming from Salesforce, would create an always up-to-date burndown chart:
To keep things simple for this example, I decided to create a new Object in Salesforce called TimeEntry (TimeEntry__c), which contains two basic fields:
Once configured, the TimeEntry object can be used in the Salesforce UI to populate at least one full month of data. Example data is provided below:
Within Heroku, a new Dyno can be created for use by this repository. Before adding the API source code using Spring Boot, I included a Heroku Postgres add-on. For the purposes of this repository, the Hobby Dev (free) edition will suffice.
One additional table, called GOALS, needs to be added to the public schema:
CREATE TABLE GOALS (
    ID INT PRIMARY KEY NOT NULL,
    GOAL_MONTH INT NOT NULL,
    GOAL_YEAR INT NOT NULL,
    GOAL_TARGET INT NOT NULL,
    BILLABLE_DAYS INT NOT NULL
);
Some sample data should be added, similar to what is listed below:
INSERT INTO GOALS(ID, GOAL_MONTH, GOAL_YEAR, GOAL_TARGET, BILLABLE_DAYS) VALUES (1, 10, 2020, 250, 21);
INSERT INTO GOALS(ID, GOAL_MONTH, GOAL_YEAR, GOAL_TARGET, BILLABLE_DAYS) VALUES (2, 11, 2020, 250, 19);
INSERT INTO GOALS(ID, GOAL_MONTH, GOAL_YEAR, GOAL_TARGET, BILLABLE_DAYS) VALUES (3, 12, 2020, 250, 21);
The GOALS table should now appear as shown below:
Heroku Connect is a Dyno add-on that provides connectivity between Heroku and Salesforce. For this project, the (free) Demo Edition was selected to demonstrate this add-on's power.
Once added to the Dyno in Heroku, a wizard will provide the necessary connectivity between Salesforce and Heroku PostgreSQL. After connectivity has been established, the TimeEntry (TimeEntry__c) object should be selected and the following fields chosen for synchronization:
For this project, a synchronization interval of 10 minutes will be fine since time entries are only added once a day. Below is an example of the Mappings screen:
The Spring Initializr was used to create a basic API using Maven. The base dependencies are noted below:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-jersey</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
    <groupId>org.postgresql</groupId>
    <artifactId>postgresql</artifactId>
    <scope>runtime</scope>
</dependency>
Two entity objects were created for this prototype, each mapping to a table in Heroku Postgres.
The Goal entity included a helper method to compute the average hours required (per day) to stay on track, with the goal defined for a given month and year:
import javax.persistence.*;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

@Entity
@AllArgsConstructor
@NoArgsConstructor
@Data
@Table(schema = "public", name = "GOALS")
public class Goal {
    @Id
    private int id;

    @Column(name = "GOAL_MONTH")
    private int month;

    @Column(name = "GOAL_YEAR")
    private int year;

    @Column(name = "GOAL_TARGET")
    private int goal;

    @Column(name = "BILLABLE_DAYS")
    private int days;

    @Transient
    public float getAveragePerDay() {
        if (goal > 0 && days > 0) {
            return (float) goal / (float) days;
        }
        return 0f;
    }
}
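As a quick check of the helper's arithmetic, the October sample goal of 250 hours across 21 billable days comes out to roughly 11.9 hours per day. The standalone sketch below mirrors the helper's math; the class name AverageCheck is illustrative only and is not part of the project:

```java
// Standalone sketch of the getAveragePerDay() arithmetic, using the
// October sample data (250-hour target, 21 billable days).
public class AverageCheck {

    // Mirrors the guard and float division in the Goal entity helper.
    static float averagePerDay(int goal, int days) {
        if (goal > 0 && days > 0) {
            return (float) goal / (float) days;
        }
        return 0f;
    }

    public static void main(String[] args) {
        System.out.println(averagePerDay(250, 21)); // roughly 11.9 hours/day
    }
}
```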
The Time entity object mirrored the Salesforce-based table in Heroku Postgres:
import java.util.Date;

import javax.persistence.*;

import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

@Entity
@AllArgsConstructor
@NoArgsConstructor
@Data
@Table(schema = "salesforce", name = "timeentry__c")
public class Time {
    @Id
    private long id;

    @Column(name = "date__c")
    private Date date;

    @Column(name = "hours__c")
    private float hours;
}
Using JPA repositories and very little custom code, the following RESTful URIs were exposed through the SampleController class in Spring Boot:
GET hostname/goal/{year}/{month} returns a Goal entity, which contains the following payload:
{
    "id": 1,
    "month": 10,
    "year": 2020,
    "goal": 250
}
GET hostname/time/{year}/{month} returns a List<Time> result set, which contains the following payload:
[
    {
        "id": 1,
        "date": "2020-10-01T04:00:00.000+00:00",
        "hours": 12.5
    },
    {
        "id": 2,
        "date": "2020-10-02T04:00:00.000+00:00",
        "hours": 10.0
    } ...
]
GET hostname/months returns a List<Month> result set, which contains a list of valid month/year combination data:
[
    {
        "month": 10,
        "year": 2020,
        "displayName": "10/2020"
    } ...
]
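To picture how little custom code is involved, consider that the /months payload can be derived directly from the (GOAL_MONTH, GOAL_YEAR) pairs in the GOALS table. The sketch below is illustrative only; the class and method names (MonthListSketch, fromGoalRows) are assumptions, not the project's actual code:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: derive the /months payload from GOALS rows.
public class MonthListSketch {

    // Mirrors the Month payload shown above.
    public static class Month {
        public final int month;
        public final int year;
        public final String displayName;

        Month(int month, int year) {
            this.month = month;
            this.year = year;
            this.displayName = month + "/" + year;
        }
    }

    // Each int[] stands in for a (GOAL_MONTH, GOAL_YEAR) row.
    static List<Month> fromGoalRows(List<int[]> rows) {
        List<Month> months = new ArrayList<>();
        for (int[] row : rows) {
            months.add(new Month(row[0], row[1]));
        }
        return months;
    }

    public static void main(String[] args) {
        List<Month> months =
                fromGoalRows(List.of(new int[] {10, 2020}, new int[] {11, 2020}));
        System.out.println(months.get(0).displayName); // 10/2020
    }
}
```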
With everything in place, the following environment attributes are expected to run the API:
${JDBC_DATABASE_URL} - database URL to access PostgreSQL [spring.datasource.url]
${JDBC_DATABASE_USERNAME} - user name to access the PostgreSQL database [spring.datasource.username]
${JDBC_DATABASE_PASSWORD} - password for the PostgreSQL database [spring.datasource.password]
${PORT} - port for Spring Boot service [server.port] (optional, default is 8080)
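For local development outside of Heroku, these environment variables map onto Spring properties along the following lines. This is a sketch of an application.properties file, with the 8080 fallback matching the optional port default noted above:

```properties
spring.datasource.url=${JDBC_DATABASE_URL}
spring.datasource.username=${JDBC_DATABASE_USERNAME}
spring.datasource.password=${JDBC_DATABASE_PASSWORD}
server.port=${PORT:8080}
```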
The great news here is that, when the Spring Boot API runs in a Heroku Dyno, none of these items have to be set. Instead, each one becomes automatically available.
Establishing a client using the Angular CLI made the process of creating application components and services quite simple. Adding Highcharts to the client became a matter of a few commands:
npm install highcharts-angular --save
npm install highcharts --save
The app.module.ts needed to be updated to make Highcharts available:
...
import { HighchartsChartModule } from 'highcharts-angular';

@NgModule({
  imports: [
    ...
    HighchartsChartModule
  ],
  ...
})
After using the Angular CLI to create a basic ChartComponent, I added methods inside chart.component.ts to connect to the Spring Boot service and set the necessary chartOptions JSON for use by Highcharts. Once the chart configuration data was available, the ChartComponent template needed only a small section of HTML to render the burndown chart in Angular:
<div *ngIf="goal && time && chartOptions" id="container" class="pt-5">
<highcharts-chart [Highcharts]="Highcharts"
[options]="chartOptions"
style="width: 100%; height: 400px; display: block;">
</highcharts-chart>
</div>
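The burndown series that chart.component.ts assembles boils down to simple arithmetic: the hours remaining after each day's logged entries, plotted against the ideal pace from the Goal. As a language-neutral illustration, here is that arithmetic sketched in Java with made-up sample data; the actual project computes this in TypeScript, and the BurndownSketch name is an assumption:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the burndown arithmetic behind the chart:
// the hours remaining after each day's logged entries.
public class BurndownSketch {

    static List<Float> remainingByDay(float goal, float[] hoursPerDay) {
        List<Float> remaining = new ArrayList<>();
        float left = goal;
        for (float hours : hoursPerDay) {
            left -= hours;
            remaining.add(left);
        }
        return remaining;
    }

    public static void main(String[] args) {
        // Hypothetical first three days logged against a 250-hour goal.
        System.out.println(remainingByDay(250f, new float[] {12.5f, 10f, 8f}));
    }
}
```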
Finally, the constants in the environments package (environment.ts and environment.prod.ts) needed updating to include an api attribute. The local configuration for the project is noted below:
export const environment = {
  production: false,
  api: 'http://localhost:8080'
};
Next, following the instructions noted in my Using Heroku for Static Web Content article, I used a Node.js Express server to run the Angular client within Heroku as a static web application.
After getting both the API and client repositories into GitLab and adding each respective Heroku Dyno as a git-remote, I was able to push the source code to Heroku using the following CLI command:
git push heroku
Once deployed, the Angular client serves the static web content, including a pick-list of available burndown chart options pulled from data that exists in the Salesforce org. When an option is selected, the client experience updates, as shown below:
In this article, two Heroku Dynos were created to yield the following architecture:
All of the source code referenced in this project can be found in the following repositories on GitLab:
https://gitlab.com/johnjvester/salesforce-integration-api
https://gitlab.com/johnjvester/salesforce-integration-client
Within a very short amount of time, an existing data source within Salesforce was synchronized into Heroku Postgres using Heroku Connect, complementing an existing table there. With the database in place, a Spring Boot service was created to provide a RESTful API into this data, which was then used to create a burndown chart with Angular and Highcharts.
The end result of this example demonstrates not only how efficiently the integration could be performed, but also the flexibility to adapt to whatever business expectations arise. What is even more exciting is that every aspect of this example added no cost to my Heroku account.
Having spent a large portion of my career focused on integrating different systems, the biggest benefit is the small amount of time required to reach a working product. Getting to spend my time focusing on writing business logic to enhance the application experience makes Heroku well worth it.
Have a really great day!
Also published on: https://mulesoft.designgrows.com/integrating-traditional-cloud-development-with-salesforce/