Amendments and rectifications to EwitEase
This article serves as a continuation of my previous blog post about the Ewity e-commerce wrapper I developed. Since its publication, I've received feedback, much of it constructive, and as always I appreciate it and have adjusted the source code accordingly. One recurring suggestion has been to transition to TypeScript. Taking this advice on board, I've migrated the entire project to TypeScript. Additionally, I've made some aesthetic changes to better suit a grocery environment. Moving forward, all updates to the project will be made exclusively in the TypeScript repository.
Another significant change I've made is adopting a proxy for communicating with the Ewity API instead of directly interfacing with it. This decision was driven by security considerations, as I aim to prevent the exposure of our Ewity API key to clients, which could pose significant security risks. In essence, the objectives of the proxy are delineated as follows:
Restrict API requests to authorized origins only: This measure is primarily aimed at ensuring that browser requests originate exclusively from our client application, thereby guarding against cross-origin requests from other websites.
Sign each incoming request with an expiration timestamp and validate the signature before forwarding it to the Ewity API: By signing requests, the aim is to verify the integrity and authenticity of the request data using a cryptographic signature. It's important to note that authentication is handled by the client in our architecture, while the proxy merely mediates between the application and Ewity, so there is no foolproof way for the proxy to authenticate incoming requests. Signing requests is not complete protection, but it provides a meaningful layer of security compared to having no mitigation strategy at all.
Limit access to specific endpoints: Rather than accepting and forwarding every incoming request with the requisite authentication headers to Ewity, the intention is to restrict access to only the endpoints necessary for our client application. Moreover, we seek to enhance efficiency by consolidating multi-step processes, such as quotation creation on Ewity, into a single step on the client side. This approach enables the proxy to seamlessly manage the required steps, thereby optimizing workflow.
Now that we have a comprehensive understanding of the entire process and what needs to be accomplished, let's put our plan into action. For this proxy setup, we'll leverage Cloudflare workers and utilize Hono as our web framework.
To begin, we'll create our project by executing the following command in our terminal:
npm create hono@latest ewitproxy
During the project creation process, I selected the following options:
eyaadh@Ahmeds-MacBook-Pro-173 WebstormProjects % npm create hono@latest ewitproxy
create-hono version 0.5.0
✔ Using target directory … ewitproxy
✔ Which template do you want to use? › cloudflare-workers
cloned honojs/starter#main to /Users/eyaadh/WebstormProjects/ewitproxy
✔ Do you want to install project dependencies? … yes
✔ Which package manager do you want to use? › npm
✔ Installed project dependencies
✔ Copied project files
Get started with: cd ewitproxy
This scaffolds a template project to kickstart our development process. Furthermore, I've installed an additional dependency necessary for generating signatures as per our earlier discussion. To install this dependency, navigate to the project directory and run the following command:
npm i crypto-js
Next, let's modify the wrangler.toml file in the project's root directory. This file functions as the configuration file for Wrangler, the CLI tool provided by Cloudflare for managing Cloudflare Workers. It includes the settings and metadata specific to our Cloudflare Workers project, including details such as the project name and other project-specific configurations like environment variables. Below is the content of my wrangler.toml file:
name = "ewitproxy"
main = "src/index.ts"
compatibility_date = "2023-12-01"
node_compat = true
[vars]
POS_API_KEY = "Bearer xxxx"
SECRET_KEY = "xxxx"
Here are the key takeaways from this configuration file:
- The main parameter specifies the entry point for our project.
- Enabling node_compat grants access to Node.js compatibility features, allowing the project to utilize Node.js-specific modules and APIs.
- In the [vars] section, POS_API_KEY holds the Ewity API key, and SECRET_KEY serves as a 24-byte pre-shared key utilized for encrypting and decrypting the signatures carried in the headers of incoming API requests. On Linux or macOS, a suitable random key can be generated by running openssl rand -base64 24 in the terminal.
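As a cross-platform alternative (for instance on Windows, where openssl may not be available), a short sketch using Node's built-in crypto module produces an equivalent 24-byte base64 key:

```typescript
import { randomBytes } from "node:crypto";

// Generate 24 random bytes and base64-encode them,
// mirroring `openssl rand -base64 24`.
const key = randomBytes(24).toString("base64");
console.log(key);
```

The resulting string can be pasted directly into wrangler.toml as SECRET_KEY.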
Now, let's begin writing the necessary source code for the worker. To do this, open the src/index.ts file and remove all its existing content.
First, we set up our hono app and define a few constants that we'll frequently use across the application. Here's how it's done in code:
import { Hono } from 'hono';
const app = new Hono();
const apiUrl = 'https://api.ewitypos.com/';
let headers = {
'Authorization': '',
'Content-Type': 'application/json'
};
Subsequently, we'll integrate the cors middleware into our application. This middleware plays a crucial role in managing Cross-Origin Resource Sharing (CORS) restrictions, which aligns with our first objective for the worker. Essentially, with this middleware, we specify the allowed origins permitted to access the proxy resources.
Here's how the updated code for the project looks after implementing this:
import { Hono } from 'hono';
import { cors } from 'hono/cors';
const app = new Hono();
const apiUrl = 'https://api.ewitypos.com/';
let headers = {
'Authorization': '',
'Content-Type': 'application/json'
};
app.use('*', cors({ origin: ['http://localhost:5174'] }));
In essence, this middleware instructs the application to allow requests originating from http://localhost:5174 to access the proxy resources, satisfying the browser's CORS policy. Requests generated from other origins are blocked. In a production environment, you would replace localhost:5174 with the address where your client application is hosted.
Now, let's shift our attention to the middleware responsible for validating signatures. Following the process we've outlined, the client will include a header key named x-signature, which holds an encrypted signature value, with each request directed to the proxy. Upon receiving these requests, the worker will deconstruct, decrypt, and verify the signature to authenticate the received request.
To grasp how the middleware operates, it's crucial to understand how the signatures are generated. Therefore, while we're at it, let's also create the function responsible for generating signatures within the client application. In the client app, create a file named generateProxySignature.ts within the src/utils directory. Below is the code for it:
import CryptoJS from "crypto-js";
const generateSignature = (apiUrl: string) => {
const secretKey: string = import.meta.env.VITE_PROXY_SECRET_KEY
const timestamp = Date.now();
const expiration = timestamp + (60 * 1000) //signature expiration in milliseconds
const encryptedExpiration = CryptoJS.AES.encrypt(expiration.toString(), secretKey).toString()
const dataToSign = apiUrl + encryptedExpiration
const hmac = CryptoJS.HmacSHA256(dataToSign, secretKey)
return `${hmac.toString()}.${encryptedExpiration}`
}
export default generateSignature
Breakdown of the code:
- Importing CryptoJS Module:
import CryptoJS from "crypto-js";
We begin by importing the CryptoJS module. This module is necessary for conducting cryptographic operations like encryption and hashing. It's worth noting that this is the same library we added as an additional dependency in our worker. If you haven't installed it yet on the client app, you can do so by running the command npm i crypto-js.
- Function Definition:
const generateSignature = (apiUrl: string) => {
Here, we define the generateSignature function, which takes a single parameter apiUrl of type string.
- Extracting Secret Key:
const secretKey: string = import.meta.env.VITE_PROXY_SECRET_KEY;
This key is the same pre-shared key that we previously generated and defined in the wrangler.toml file.
- Generating Timestamp:
const timestamp = Date.now();
Here, we create a timestamp that reflects the current time in milliseconds. This timestamp is then utilized to determine the expiration time of the signature.
- Calculating Expiration:
const expiration = timestamp + (60 * 1000);
Here, we establish the expiration time for the signature. The goal is to create a short-lived signature intended for minimal usage with each request. In this instance, we set the expiration time to 60 seconds (1 minute) after the current timestamp. If necessary, this duration can be shortened for stricter control, although it's important to consider the possibility of a client experiencing poor internet connectivity.
- Encrypting Expiration:
const encryptedExpiration = CryptoJS.AES.encrypt(expiration.toString(), secretKey).toString();
We proceed by encrypting the expiration timestamp using the AES encryption algorithm and the secret key. Subsequently, the encrypted expiration is converted into a string representation.
- Constructing Data to Sign:
const dataToSign = apiUrl + encryptedExpiration;
Here, we concatenate the apiUrl and the encrypted expiration timestamp to construct the data that needs to be signed.
- Generating HMAC Signature:
const hmac = CryptoJS.HmacSHA256(dataToSign, secretKey);
Next, we generate an HMAC (Hash-based Message Authentication Code) signature using the SHA-256 hashing algorithm. The data to be signed and the secret key are used as inputs to generate the HMAC.
- Returning Signature:
return `${hmac.toString()}.${encryptedExpiration}`;
Finally, we return the HMAC signature, a dot (.) separator, and the encrypted expiration timestamp concatenated together. This combined string represents the generated signature.
- Exporting Function:
export default generateSignature;
We conclude by exporting the generateSignature function, making it accessible to other modules in the client application.
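To see the sign/verify round trip end to end without any environment variables, here is a sketch using Node's built-in crypto module instead of crypto-js. The key, the scrypt salt, and the explicit key derivation are illustrative assumptions (crypto-js performs its own OpenSSL-style derivation internally), so this is conceptually equivalent to the client code above rather than byte-compatible with it:

```typescript
import {
  createCipheriv,
  createDecipheriv,
  createHmac,
  randomBytes,
  scryptSync,
} from "node:crypto";

// Hypothetical pre-shared key; in the real app this comes from
// VITE_PROXY_SECRET_KEY on the client and SECRET_KEY on the worker.
const secretKey = "demo-secret";

// Derive a 32-byte AES key from the passphrase. The fixed salt is an
// illustrative assumption for this sketch only.
const aesKey = scryptSync(secretKey, "demo-salt", 32);

const sign = (apiUrl: string, ttlMs = 60_000): string => {
  const expiration = (Date.now() + ttlMs).toString();
  // Encrypt the expiration with AES-256-CBC and a random IV.
  const iv = randomBytes(16);
  const cipher = createCipheriv("aes-256-cbc", aesKey, iv);
  const encrypted = Buffer.concat([cipher.update(expiration, "utf8"), cipher.final()]);
  const encryptedExpiration = `${iv.toString("base64")}:${encrypted.toString("base64")}`;
  // HMAC over the URL plus the encrypted expiration, as in the client code.
  const hmac = createHmac("sha256", secretKey)
    .update(apiUrl + encryptedExpiration)
    .digest("hex");
  return `${hmac}.${encryptedExpiration}`;
};

const verify = (apiUrl: string, signature: string): boolean => {
  const dot = signature.indexOf(".");
  if (dot < 0) return false;
  const receivedHmac = signature.slice(0, dot);
  const encryptedExpiration = signature.slice(dot + 1);
  // Recompute and compare the HMAC first.
  const expected = createHmac("sha256", secretKey)
    .update(apiUrl + encryptedExpiration)
    .digest("hex");
  if (expected !== receivedHmac) return false;
  // Decrypt the expiration and make sure it is still in the future.
  const [ivB64, dataB64] = encryptedExpiration.split(":");
  const decipher = createDecipheriv("aes-256-cbc", aesKey, Buffer.from(ivB64, "base64"));
  const expiration = Buffer.concat([
    decipher.update(Buffer.from(dataB64, "base64")),
    decipher.final(),
  ]).toString("utf8");
  return Date.now() <= parseInt(expiration, 10);
};
```

Signing and then verifying the same URL succeeds, while a different URL or a tampered HMAC fails, which is exactly the behavior the worker middleware below relies on.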
Now that we understand the signature generation process, let's proceed to the worker source code and begin writing the middleware to validate the signatures. Below is the structure of the middleware handler:
import { Hono } from 'hono'
import { cors } from 'hono/cors'
import CryptoJS from 'crypto-js'
import { env } from 'hono/adapter'
// Signature validation middleware
app.use(async (ctx, next) => {
try {
const { SECRET_KEY } = env<{ SECRET_KEY: string }>(ctx, 'workerd')
const { POS_API_KEY } = env<{ POS_API_KEY: string }>(ctx, 'workerd')
headers.Authorization = POS_API_KEY
const receivedSignature = ctx.req.raw.headers.get('x-signature')
if (!receivedSignature) {
return ctx.json({ error: 'Invalid signature' }, 403)
}
const [receivedHmac, encryptedExpiration] = receivedSignature.split('.')
const timestamp = Date.now()
const expiration = CryptoJS.AES.decrypt(encryptedExpiration, SECRET_KEY).toString(CryptoJS.enc.Utf8) // Decrypt expiration time
if (timestamp > parseInt(expiration)) {
return ctx.json({ error: 'Signature expired' }, 403)
}
const dataToSign = ctx.req.url + encryptedExpiration
const hmac = CryptoJS.HmacSHA256(dataToSign, SECRET_KEY)
if (hmac.toString() !== receivedHmac) {
return ctx.json({ error: 'Invalid signature' }, 403)
}
await next()
} catch (e) {
console.error(e)
return ctx.newResponse('Invalid signature', 403)
}
})
Breakdown of the code:
- Environment Variables Extraction:
const {SECRET_KEY} = env<{ SECRET_KEY: string }>(ctx, 'workerd')
const {POS_API_KEY} = env<{ POS_API_KEY: string }>(ctx, 'workerd')
headers.Authorization = POS_API_KEY
Here, the middleware extracts the SECRET_KEY and POS_API_KEY environment variables using the env function provided by the hono framework. As a reminder, we defined these environment variables under the [vars] section of our wrangler.toml file.
- Signature Retrieval:
const receivedSignature = ctx.req.raw.headers.get('x-signature')
This part retrieves the x-signature header from the incoming request's headers.
- Signature Verification:
if (!receivedSignature) {
return ctx.json({error: 'Invalid signature'}, 403)
}
If the signature is missing, the middleware returns a JSON response with an error message and a status code of 403.
- Decryption of Encrypted Expiration:
const [receivedHmac, encryptedExpiration] = receivedSignature!.split('.')
const expiration = CryptoJS.AES.decrypt(encryptedExpiration, SECRET_KEY).toString(CryptoJS.enc.Utf8) // Decrypt expiration time
The encrypted expiration timestamp is decrypted using the SECRET_KEY extracted earlier, then converted from its encrypted form to a string representation.
- Signature Expiration Check:
const timestamp = Date.now();
if (timestamp > parseInt(expiration)) {
return ctx.json({error: 'Signature expired'}, 403)
}
The middleware checks if the current timestamp exceeds the expiration timestamp extracted from the decrypted signature. If it does, it returns a JSON response indicating that the signature has expired, with a status code of 403.
- Data Preparation for Signature Verification:
const dataToSign = ctx.req.url + encryptedExpiration
The data to be signed is prepared by concatenating the request URL and the encrypted expiration timestamp.
- HMAC Signature Verification:
const hmac = CryptoJS.HmacSHA256(dataToSign, SECRET_KEY)
if (hmac.toString() !== receivedHmac) {
return ctx.json({error: 'Invalid signature'}, 403)
}
Finally, the middleware computes the HMAC signature from the request data and the SECRET_KEY, then compares it with the received HMAC signature. If they do not match, it returns a JSON response indicating an invalid signature, with a status code of 403.
- Next Middleware Execution:
await next()
If all checks pass successfully, the middleware proceeds to execute the next middleware/routes in the chain.
- Error Handling:
} catch (e) {
console.error(e)
return ctx.newResponse('Invalid signature', 403)
}
Any errors that occur during the execution of the middleware are caught, logged, and handled by returning a plain-text response indicating an invalid signature, with a status code of 403.
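One hardening worth considering, though not part of the middleware above, is comparing the HMACs in constant time; a plain !== comparison can in principle leak information through timing differences. A sketch using Node's timingSafeEqual (its availability in Workers via node_compat is an assumption worth verifying):

```typescript
import { timingSafeEqual } from "node:crypto";

// Compare two signature strings without short-circuiting
// on the first mismatching byte.
const safeCompare = (a: string, b: string): boolean => {
  const bufA = Buffer.from(a);
  const bufB = Buffer.from(b);
  // timingSafeEqual throws on length mismatch, so check length first.
  return bufA.length === bufB.length && timingSafeEqual(bufA, bufB);
};
```

In the middleware, `hmac.toString() !== receivedHmac` would become `!safeCompare(hmac.toString(), receivedHmac)`.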
Now that we've set up the necessary middleware for the worker, let's consider the routes we need. One of our objectives in determining the proxy's functionality is to restrict client access to only the necessary endpoints. Therefore, for each call made by the client to the Ewity API, we'll create a corresponding route. When designing these routes, our aim should also be to minimize any modifications required in the client application source code.
Let's begin by examining the method used by the client to fetch the categories:
export const usePosStore = defineStore("posStore", {
state: () => ({
axiosConfig: {
method: 'get',
maxBodyLength: Infinity,
url: null,
headers: {
'Authorization': import.meta.env.VITE_POS_API_KEY
}
} as AxiosRequestConfig,
apiUrl: import.meta.env.VITE_POS_API_URL,
// Other state properties...
}),
actions: {
// Other actions...
fetchCategories() {
return new Promise(async (resolve, reject) => {
const config: AxiosRequestConfig = { ...this.axiosConfig }
config.method = 'GET'
config.url = `${this.apiUrl}v1/products/locations/all?q_Category=${this.selectedCategoryId}&page=${this.selectedCategoryPage}`
try {
const resp = await axios.request(config)
if (resp.data.pagination.total > 0) {
resolve({
categories: resp.data.data,
currentPage: resp.data.pagination.current,
categoriesLastPage: resp.data.pagination.lastPage,
categoriesTotalItems: resp.data.pagination.total
})
}
} catch (e) {
reject(e)
}
})
},
// Other actions...
}
})
Based on the method provided above, our proxy route should resemble the following:
app.get('/products/categories', async (ctx) => {
const page = ctx.req.query('page') ?? '1'
const response = await fetch(`${apiUrl}v1/products/categories?page=${page}`, {
headers: headers
})
const categories = await response.json()
return ctx.json(categories)
})
Before delving into this route, let's cover some basics. In hono, each route comprises a path and its corresponding handler. Within the handler function, you have access to the request and response through a context object, which in the provided example we refer to as ctx. Path parameters are defined by appending :parameterName to the route path, such as /products/categories/:id, and can then be accessed within the handler using ctx.req.param('parameterName'). Similarly, query parameters can be accessed within the handler using ctx.req.query('queryParameterName').
In the route handler above, we first check if the request includes a page query parameter. If it's present, we assign its value to the page variable; otherwise, we default to 1. Subsequently, a GET request is sent to the Ewity API to fetch categories; the API URL is constructed with the page query parameter to specify which page of categories to retrieve, and the headers object is passed along to include the necessary authorization headers. Upon receiving the response from the Ewity API, we parse the response body as JSON and return it as the route's JSON response.
With this route established, let's also modify the fetchCategories method on the client to utilize it. First, we replace the apiUrl in the state with a proxyApiUrl pointing to the base path of the proxy. We also remove the Authorization key from the headers, as authorization is now handled by the proxy. Within the method, we add an X-Signature header whose value comes from the generateSignature function defined earlier; we do this inside the method because each request must be signed with a fresh signature. Finally, we update the URL in the axios config to match the route path we defined on the worker. The updated code looks as follows.
// other imports
import generateSignature from "@/utils/generateProxySignature";
export const usePosStore = defineStore("posStore", {
state: () => ({
axiosConfig: {
method: 'get' as string,
maxBodyLength: Infinity,
url: null as string | null,
headers: {}
} as AxiosRequestConfig,
proxyApiUrl: import.meta.env.VITE_PROXY_API_URL as string,
// Other state properties...
}),
actions: {
// Other actions...
fetchCategories() {
return new Promise(async (resolve, reject) => {
const config: AxiosRequestConfig = { ...this.axiosConfig };
config.method = 'GET';
config.url = `${this.proxyApiUrl}products/categories?page=${this.categoriesPage}`;
config.headers = {
'X-Signature': generateSignature(config.url)
};
try {
const resp = await axios.request(config);
if (resp.data.pagination.total > 0) {
resolve({
categories: resp.data.data,
currentPage: resp.data.pagination.current,
categoriesLastPage: resp.data.pagination.lastPage,
categoriesTotalItems: resp.data.pagination.total
});
} else {
// Resolve with empty results so the promise always settles
resolve({
categories: [],
currentPage: 1,
categoriesLastPage: 1,
categoriesTotalItems: 0
});
}
} catch (e) {
reject(e);
}
});
},
// Other actions...
}
});
All other routes follow a similar pattern and concept. Let's further elucidate by examining one more route, specifically a POST request, where we streamline a multi-step process into a single step on the client's end. Below is the method currently utilized for creating quotations on our client application.
createQuotation(customerId: number): Promise<string> {
return new Promise(async (resolve, reject) => {
// 1. create quotation
const config: AxiosRequestConfig = { ...this.axiosConfig };
config.method = 'POST';
config.url = `${this.apiUrl}v1/quotations`;
config.data = {
"location_id": this.locationId,
"customer_id": customerId,
};
try {
const newQuote = await axios.request(config);
//2. collect the quotation ID for the newly created quotation
const newQuoteId = newQuote.data.data.id;
//3. update the quotation with my cart
config.url = `${this.apiUrl}v1/quotations/${newQuoteId}/lines`;
config.data = { lines: this.myCart };
const updatedQuote = await axios.request(config);
resolve(updatedQuote.data.data.number);
} catch (e) {
reject(e);
}
});
}
The corresponding route for this method on the proxy will be as follows:
app.post('/quotations', async (ctx) => {
const data = await ctx.req.json();
// 1. Create a quotation
const createQuoteResp = await fetch(`${apiUrl}v1/quotations`, {
method: 'POST',
headers: headers,
body: JSON.stringify({
location_id: data.location_id,
customer_id: data.customer_id
})
});
// 2. Gather the quotation details
const quotation = await createQuoteResp.json() as any;
// 3. Update the quotation with the lines
const updateQuoteResp = await fetch(`${apiUrl}v1/quotations/${quotation.data.id}/lines`, {
method: 'POST',
headers: headers,
body: JSON.stringify({
lines: data.lines
})
});
const updatedQuotation = await updateQuoteResp.json();
return ctx.json(updatedQuotation);
});
In this route, we expect the client to provide the quotation details as JSON within a single request. Upon receiving the request, we extract this data and store it in a constant called data. Subsequently, we initiate the first step of creating a quotation on Ewity by invoking the appropriate endpoint to generate an empty quotation. Following this, we update the newly created quotation by adding the required lines. Finally, we return a JSON response containing the updated quotation details obtained from the Ewity API.
Consequently, on the client end, we update the method as follows:
createQuotation(customerId: number): Promise<string> {
return new Promise(async (resolve, reject) => {
const config: AxiosRequestConfig = {...this.axiosConfig}
config.method = 'POST'
config.url = `${this.proxyApiUrl}quotations`
config.data = {
"location_id": this.locationId,
"customer_id": customerId,
"lines": this.myCart
}
config.headers = {
'X-Signature': generateSignature(config.url)
}
try {
const quote = await axios.request(config)
resolve(quote.data.data.number)
} catch (e) {
reject(e)
}
})
}
From my perspective, this approach enhances readability and simplifies understanding of the client-side code. Moreover, since users are not allowed to modify quotes once an order is placed, this method proves more efficient: there is no reason to expose the fact that two API calls are needed to create a single quotation.
As previously mentioned, given that all other endpoints and routes adhere to the same pattern and concept, I encourage you to practice and implement the remaining routes and adjustments to the client-side methods. For reference, you can access the complete source of my implementation in the GitHub repository for the worker, as well as the updated repository for the client application here.
Once you have added the routes, conclude index.ts by exporting the hono app as demonstrated below:
export default app
To deploy the worker to Cloudflare, you can use the following command:
npx wrangler deploy
My intention in documenting this as a blog post extends beyond merely sharing the amendments I implemented to address security concerns. It is also an opportunity to share my experience and present a method for securing applications in scenarios where altering the architecture is not an option. In this case, it was imperative to adhere to the rules we established at the very beginning of the project, as mentioned in my earlier blog post: the entire application needed to be serverless, without the inclusion of an additional database.
Additionally, I aim to introduce you to Cloudflare Workers, a powerful tool for hosting serverless applications, and to Hono, a young but capable web framework. Through this blog post, I hope to provide insight into their potential and relevance in modern application development.
With that said, I'll conclude here. Until we meet again during another exciting project, where I'll document my journey once more.