Home
The JavaScript ecosystem is one that never rests. We often rely on flavor-of-the-month frameworks and utilities that quickly go out of style or no longer provide us with the value that newer frameworks do.
What's worse is that we often couple our valuable business logic with framework-specific actors, making it difficult to adopt or migrate to new technologies. For example, within the Angular domain, it is common to disperse business logic across services, controllers, and even directives.
This scenario is present with backend frameworks as well. Taking Sails as an example, it is common to disperse our business logic among controllers. In addition, it is common practice to consume our data access logic via ORMs/ODMs such as Mongoose directly within our business logic.
These practices not only litter our business logic with cross-cutting concerns, but also make it challenging to migrate our code to a new framework or adopt new data access technologies.
Further, coupling our logic with framework-specific actors makes it difficult to share code between tiers. It would be smelly to consume Angular services or controllers from our Express.js controllers, or conversely, our Sails controllers from our React stores. Coupling business logic with framework-specific actors makes it awkward or difficult to reuse.
Lastly, testing can quickly grow complicated when there are too many actors present in a workflow as it requires a dizzying array of test stubs. Take a controller action in any framework for example. In the controller action, you will most likely find business logic written directly in the action.
That business logic may even directly consume a data access technology, logger, and other cross-cutting concerns. This leads to code that becomes increasingly difficult to test as we are not only testing our controller action and business logic, but all of the other interspersed code that has little or nothing to do with our business logic.
To be clear, I believe these frameworks offer tremendous benefit, both on the client and the server. What I am proposing, however, is to abstract our business logic into composable units by creating code that is completely agnostic of its consumers.
By componentizing our business logic, we can easily test, swap out, rearrange, reuse, and consume bits within any application architecture using any JavaScript client, server, and data access technologies and frameworks imaginable.
peasy-js is a middle-tier framework that makes it trivial to whimsically swap out UI, backend, and data access frameworks in your applications by creating your business logic in a composable, reusable, scalable, and testable manner. In other words, peasy-js offers guidance in abstracting our business logic into composable units by authoring code that is completely agnostic of its consumers.
It promotes creating business logic that is completely decoupled from its consuming technologies and helps ensure that separation of concerns (SoC) is adhered to.
I know what you're thinking: "ugh, another framework?" Yes, peasy-js is indeed a microframework. However, chances are that if we venture down the path of componentizing our business logic, we'll end up writing our own microframework anyhow.
Countless hours have been contributed to the design, development, and testing of peasy-js, supporting almost any workflow imaginable. With a low barrier to entry, I'm hopeful that you'll find the small investment in learning to be well worth your time.
If, however, you find that peasy-js isn't quite for you, hopefully you'll gain some insight into how you might implement your own business layer using some of the patterns in the framework.
Let's check out what peasy-js offers us.
- Easy to use and flexible business and validation rules engine
- Scalability and Reusability (decouples business and validation logic from consuming code and frameworks)
- Easy testability
peasy-js encompasses four main actors. Each actor is outlined below with a brief description and will be covered in more depth throughout the article.
1. Business Service
The actor that represents an entity and is responsible for exposing business functionality via commands. These commands encapsulate CRUD and other business-related functions.
2. Command
The actor responsible for orchestrating the execution of initialization logic, validation and business rule execution, and command logic (data proxy invocations, workflow logic, etc.), respectively, via the command execution pipeline.
3. Rule
The actor that is used to assert something. A rule can be created to represent anything from a validation rule (field length or required), to a business rule (authorization, price validity, etc.). Rules are consumed by commands and can be chained, configured to execute based on a previous rule's execution, etc. Rules can also be configured to execute code based on the result of their execution.
4. Data Proxy
The actor within the peasy-js framework responsible for data storage and retrieval, serving as an abstraction layer for data stores including (but not limited to) the following:
- Relational Databases - SQLite, MySQL, Oracle, SQL Server, etc.
- Document (NoSQL) Databases - MongoDB, VelocityDB, etc.
- Services - HTTP, SOAP, etc.
- Cache Stores - Redis, Azure, etc.
- Queues - RabbitMQ, MSMQ, etc.
- File System
- In-memory data stores for testing
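To make that shared contract concrete, here is a minimal sketch of an in-memory data proxy of the kind you might use for testing. The names (CustomerInMemoryDataProxy, getAll) are illustrative and not part of peasy-js; the only requirement is that it exposes the same callback-style interface as any other data proxy.

```javascript
// A hypothetical in-memory data proxy for testing. It exposes the same
// insert(data, done) callback signature as the other data proxies, so a
// business service can consume it without knowing where the data lives.
var CustomerInMemoryDataProxy = function() {
  var _data = [];
  var _nextId = 1;

  return {
    insert: insert,
    getAll: getAll
  };

  function insert(data, done) {
    // Copy the payload and assign a surrogate id, mimicking a data store.
    var record = Object.assign({}, data, { id: _nextId++ });
    _data.push(record);
    done(null, record);
  }

  function getAll(done) {
    done(null, _data.slice());
  }
};
```

Because it honors the same contract, it can be injected into a business service in place of an HTTP or database proxy.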
Here is a sample of what it might look like to consume your business logic written with peasy-js within an Angular service in the client:
var dataProxy = new CustomerHttpDataProxy();
var service = new CustomerService(dataProxy);
var customer = { name: "Frank Zappa", birthDate: new Date('12/21/1940') };
var command = service.insertCommand(customer);
command.execute(function(err, result) {
  if (result.success) {
    customer = result.value;
  } else {
    console.log(result.errors);
  }
});

Now let's look at an example of what it might look like to consume the same business logic written with peasy-js within an Express.js controller on the server:
var dataProxy = new CustomerMongoDataProxy();
var service = new CustomerService(dataProxy);
var customer = { name: "Frank Zappa", birthDate: new Date('12/21/1940') };
var command = service.insertCommand(customer);
command.execute(function(err, result) {
  if (result.success) {
    customer = result.value;
  } else {
    console.log(result.errors);
  }
});

Notice a difference? The beautiful thing is that there is no difference, except for the data proxy injected into the business service in each sample.
Remember that a data proxy is our data access abstraction, and can represent a concrete implementation of file system access, database, queue, cache, in-memory, and HTTP communications.
This abstraction allows us to swap out data proxies based on desired system architectures and configurations, while enforcing SoC and lending itself to code sharing across code bases and facilitating easier testing. What may not be immediately obvious is that this approach always subjects our payloads to the same business logic, regardless of the source or destination of our data. This will all reveal itself soon.
From a consumption standpoint, that really is all there is to it. Consuming your business logic developed with peasy-js will introduce a recognizable theme, regardless of your architecture and the technologies that consume it.
Speaking of architecture, let's turn our attention to a potential architecture that becomes easily achievable when developing our business logic in this manner while exploring the peasy-js actors a bit more in depth.

From left to right, we see that a client application consumes a framework such as Angular, React, Backbone, etc. To achieve maximum scalability, notice that we can simply move the business logic implementation from the UI framework actor implementations (services, controllers, etc.) into its own componentized codebase, or middle-tier.
Next, notice that the middle-tier communicates with the web server. This is made possible by the presence of data proxies. Referring to figure A, the Angular service consuming our business logic instantiates a CustomerHttpDataProxy. As a result, when the insert command is executed, it subjects the supplied payload to any business rules that have been configured. In the event of successful validation, the corresponding insert function of our data proxy will be invoked, and issue a POST against our configured customer endpoint accordingly.
Conversely, notice that the same business logic consumed in our front end is also consumed by our Node.js application. Referring to figure B, the Express controller consuming our business logic instantiates a CustomerMongoDataProxy. However, this time when the insert command is executed, the corresponding insert function of our data proxy will issue an INSERT against our database, using the MongoDB API or an ODM such as Mongoose.
Lastly, because our data proxy implementations adhere to the same interface, we can inject them into our business services depending on how we want to deploy our application. In the diagram, the business services consume data proxies that interact with HTTP services on the client. However, once a request is handled by the web api, the same business services hosted by Node.js are injected with data proxies that interact with a database, queue, cache, file system, etc.
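The injection described above can be sketched in a few lines. This is not the peasy-js BusinessService itself, just a hypothetical service shape showing that the consuming code never changes when the proxy does:

```javascript
// Hypothetical sketch: a service that depends only on the data proxy
// contract. The composition root on each tier decides which proxy to inject.
var CustomerServiceSketch = function(dataProxy) {
  return {
    insert: function(customer, done) {
      // Validation and business rules would run here first; on success,
      // the call is delegated to whichever proxy was injected.
      dataProxy.insert(customer, done);
    }
  };
};

// Browser:  new CustomerServiceSketch(new CustomerHttpDataProxy());
// Node.js:  new CustomerServiceSketch(new CustomerMongoDataProxy());
```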
Now that we understand the peasy-js actors from a high level and some of the benefits that they provide, let's walk through potential implementations of the actors in our consumption samples.
var CustomerHttpDataProxy = function() {
  var request = require('request');

  return {
    insert: insert
  };

  function insert(data, done) {
    request({
      method: 'POST',
      url: 'http://localhost:3000/customers',
      body: data,
      json: true
    }, function(error, response, body) {
      done(error, body);
    });
  }
};

var CustomerMongoDataProxy = function() {
  var connectionString = 'mongodb://localhost:12345/orderEntry';
  var mongodb = require('mongodb').MongoClient;

  return {
    insert: insert
  };

  function insert(data, done) {
    mongodb.connect(connectionString, function(err, db) {
      if (err) { return done(err); }
      var collection = db.collection('customers');
      collection.insert(data, function(err, data) {
        db.close();
        done(err, data);
      });
    });
  }
};

In these data proxy examples, notice that the data proxies adhere to the same interface while abstracting away the implementation logic. This is what allows us to scale our application. By simply swapping data proxies, we have a truly reusable middle tier that is completely agnostic of any consuming code (client or server). This data proxy design is key to achieving scalability and easy testability.
Lastly, notice that for brevity we've only defined an insert function in our data proxies. In a real production environment, however, we would most likely expose all CRUD operations, and perhaps a few more. You can see a full implementation of the CustomerMongoDataProxy here.
var CustomerService = BusinessService.extend({
  functions: {
    _onInsertCommandInitialization: function(context, done) {
      var customer = this.data;
      utils.stripAllFieldsFrom(customer).except(['name', 'address']);
      utils.stripAllFieldsFrom(customer.address).except(['street', 'zip']);
      done();
    }
  }
}).service;

In this example, we've provided initialization logic for the CustomerService's exposed insertCommand that whitelists fields before a call to our data proxy's insert function is invoked. Each default CRUD operation exposed via our business service implementations exposes event hooks associated with its command. These methods can be viewed here.
Notice that we use the static BusinessService.extend function, which creates a constructor function exposed via the service member of the returned object. You are also free to use ES6 inheritance or prototypal inheritance if you are more comfortable with these approaches. Samples of both can be found here.
Now that we've defined our initialization logic for our business service's insertCommand, let's create a couple of rules and wire them up accordingly:
var NameRule = Rule.extend({
  association: "name",
  params: ['name'],
  functions: {
    _onValidate: function(done) {
      if (this.name === "Jimi") {
        this._invalidate("Name cannot be Jimi");
      }
      done();
    }
  }
});

var AgeRule = Rule.extend({
  association: "age",
  params: ['birthdate'],
  functions: {
    _onValidate: function(done) {
      if (new Date().getFullYear() - this.birthdate.getFullYear() < 50) {
        this._invalidate("You are too young");
      }
      done();
    }
  }
});

Notice that we use the static Rule.extend method in both code examples, which creates a constructor function for us. You are also free to use ES6 inheritance or prototypal inheritance if you are more comfortable with those approaches. Samples of both can be found here.
Now let's wire them up in our CustomerService:
var CustomerService = BusinessService.extend({
  functions: {
    _onInsertCommandInitialization: function(context, done) {
      var customer = this.data;
      utils.stripAllFieldsFrom(customer).except(['name', 'address']);
      utils.stripAllFieldsFrom(customer.address).except(['street', 'zip']);
      done();
    },
    _getRulesForInsertCommand: function(context, done) {
      var customer = this.data;
      done(null, [
        new NameRule(customer.name),
        new AgeRule(customer.birthDate)
      ]);
    }
  }
}).service;

In our final piece of code, we've wired up our rules in our business service and injected them into our insert command execution pipeline. We've done this by supplying an implementation for the _getRulesForInsertCommand() function.
In this sample, we have configured both rules to execute regardless of the outcome of one another. For example, if the NameRule validation fails, the AgeRule will still be evaluated, and vice versa.
What's great about peasy-js rules is that they are extremely flexible and can be written and configured to support almost any scenario imaginable. For example, we could chain the rule's execution in a way that only executes AgeRule in the event that the NameRule validation succeeds, and vice versa. This is extremely useful when your rules need to acquire data from a data store (a potentially expensive hit).
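The exact chaining API belongs to peasy-js (see the rules documentation), but the idea can be sketched in plain JavaScript: evaluate rules in order and short-circuit as soon as one fails, so an expensive rule never runs behind a failed cheap one. Everything below (validateInSequence and the rule factories) is illustrative only, not the peasy-js implementation:

```javascript
// Sketch of chained rule execution: run each callback-style rule in turn,
// stopping at the first failure so later (possibly expensive) rules never run.
function validateInSequence(rules, done) {
  var errors = [];
  (function next(i) {
    if (i === rules.length || errors.length > 0) {
      return done(errors);
    }
    rules[i](function(error) {
      if (error) errors.push(error);
      next(i + 1);
    });
  })(0);
}

// Hypothetical rule factories with a (done) callback signature:
function nameRule(name) {
  return function(done) {
    done(name === "Jimi" ? "Name cannot be Jimi" : null);
  };
}

function ageRule(birthDate) {
  return function(done) {
    var age = new Date().getFullYear() - birthDate.getFullYear();
    done(age < 50 ? "You are too young" : null);
  };
}
```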
More information on rules can be found here.
Testing our business logic
Because peasy-js adheres to SOLID programming principles, it becomes very easy to test your business services, commands, and rules.
Let's look at how we can easily test our NameRule.
it("fails when the supplied name is Jimi", () => {
  var rule = new NameRule("Jimi");
  rule.validate(() => {
    expect(rule.valid).toBe(false);
    expect(rule.association).toEqual("name");
  });
});

it("succeeds when the supplied name is not Jimi", () => {
  var rule = new NameRule("James");
  rule.validate(() => {
    expect(rule.valid).toBe(true);
  });
});

By keeping our rules simple and focused, not only do they become easy to reuse, but also extremely easy to test. This also applies to testing our business services and custom commands.
Testing is a huge topic in and of itself, so for the sake of brevity, I'll leave you with this final piece of code. Just note that testing your business logic with peasy-js is extremely easy, and many samples can be found here.
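One more technique worth noting: because a data proxy is just an object with an agreed-upon interface, a hand-rolled fake is often all a service-level test needs. The helper below is hypothetical and uses plain assertions rather than a test framework:

```javascript
// A hypothetical fake data proxy that records every call, letting a test
// verify delegation without touching a real database or HTTP endpoint.
function createFakeDataProxy() {
  var calls = [];
  return {
    calls: calls,
    insert: function(data, done) {
      calls.push(data);
      // Echo the payload back with a canned id, as a data store would.
      done(null, Object.assign({}, data, { id: 1 }));
    }
  };
}
```

In a real suite, the fake would be injected into the business service under test in place of an HTTP or MongoDB proxy, and the assertions would inspect fake.calls.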
An entire sample application has been written that showcases a middle tier written with peasy-js. This business logic is consumed by an Express.js application hosted in Node.js and exposes a web API. This sample can be viewed here.
peasy-js encourages us to write business logic that is componentized, reusable, scalable, and easily testable. An awesome side effect is that it also makes it easy to deploy your code in a multitude of ways. Lastly, it makes it almost trivial to migrate to or adopt new frameworks as your current frameworks age.