The Major Widgets integration solution shows how phase one of the Major Widgets integration plan might be implemented.
> **Note**
>
> The encircled numbers in Figure 9 serve as reference points for concepts presented in this guide. For example, (1) in the text refers to the point labeled (1) in Figure 9.
Fuse ESB Enterprise's kernel provides a runtime environment with enterprise support (management, logging, provisioning, security) for the main store (A), where most of the integration applications run. Its embedded services provide the frameworks for implementing these components of the solution:
RESTful service—for creating a JAX-RS application that runs on each auto repair shop terminal (1), enabling customers to input part orders, via an order entry form, over the internet.
Web service—for creating a JAX-WS front end to implement the order entry functionality on each of the in-store terminals, which receive orders from walk-in customers (2) who purchase parts over-the-counter.
camel-cxf component—a routing and integration service component that creates an entry endpoint (3), exposing Major Widgets routing logic to the outside world as a web service or RESTful service.

Routing and integration service—for creating routes (4, 6) that direct orders received at the web/RESTful service entry point to the appropriate store's order processing back end.
Messaging service—for creating a persistent, fault-tolerant clustered messaging system (5, 5a), which ensures that no order is ever lost due to failure of the system, the message broker, or the connections between the message broker and its various clients—the front end content-based router (4) and the back end dynamic router (6).
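The entry endpoint (3) described above can be sketched as a Blueprint fragment. This is illustrative only: the bean class, ids, addresses, and queue name are assumptions, not part of the Major Widgets plan.

```xml
<!-- Hypothetical Blueprint fragment for the entry endpoint (3):
     a camel-cxf RESTful endpoint that receives part orders and hands
     them to the routing logic. All names and addresses are illustrative. -->
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0"
           xmlns:cxf="http://camel.apache.org/schema/blueprint/cxf">

  <!-- JAX-RS server endpoint that the shop terminals (1) post orders to -->
  <cxf:rsServer id="orderEntryEndpoint"
                address="http://0.0.0.0:9090/orders"
                serviceClass="com.majorwidgets.orders.OrderEntryResource"/>

  <camelContext xmlns="http://camel.apache.org/schema/blueprint">
    <route id="orderEntry">
      <!-- (3) incoming orders enter the Major Widgets routing logic here -->
      <from uri="cxfrs:bean:orderEntryEndpoint"/>
      <!-- hand off to the messaging layer for the content-based router (4) -->
      <to uri="activemq:queue:incomingOrders"/>
    </route>
  </camelContext>
</blueprint>
```

A JAX-WS front end for the in-store terminals (2) would follow the same pattern, substituting a `cxf:cxfEndpoint` for the `rsServer`.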
At the Major Widgets main store (A), the order entry front end (routing and messaging agents running inside Fuse ESB Enterprise) runs on that store's main computer system. At each of the four stores (A-D), an instance of the order entry back end (routing agent and back-end processing running in Fuse ESB Enterprise) runs on the local computer system.
When the front end web service (3) receives an online order, the routing agent passes it to a content-based router (4), which determines the store to which the part order is routed for further processing. Normally, the order then enters the target store's queue (5), where it waits until the target store retrieves it (6). (With fault tolerance built into the system, if the master broker (5) fails, the system can continue to function with no loss of orders. For details, see Built-in Broker Fault Tolerance.)
In the case of auto repair shops (1), the content-based router routes order requests to the store nearest the customer, based on the submitted zip code. In the case of walk-in customers (2), the auto supply store submits its own zip code to the front end, so the order is always routed to the local store.
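The zip-code dispatch performed by the content-based router (4) could look like the following Camel route. The queue names and the `zipResolver` bean are hypothetical; a real implementation would match them to the four stores' actual endpoints.

```xml
<!-- Illustrative content-based router (4): route each order to the store
     queue serving the submitted zip code. The 'zipResolver' bean and
     queue names are assumptions made for this sketch. -->
<route id="contentBasedRouter">
  <from uri="activemq:queue:incomingOrders"/>
  <choice>
    <when>
      <!-- hypothetical bean that tests the order's zip code -->
      <method ref="zipResolver" method="isNearStoreA"/>
      <to uri="activemq:queue:storeA.orders"/>
    </when>
    <when>
      <method ref="zipResolver" method="isNearStoreB"/>
      <to uri="activemq:queue:storeB.orders"/>
    </when>
    <otherwise>
      <!-- default: route to the main store (A) -->
      <to uri="activemq:queue:storeA.orders"/>
    </otherwise>
  </choice>
</route>
```

Because walk-in orders (2) carry the local store's own zip code, they fall through the same logic and land on the local store's queue.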
When the back end receives the submitted part order, the application employs a dynamic router (6) to check the store's database and determine whether the parts are in stock. Results depend on whether the customer is an auto repair shop or a walk-in:
Auto repair shop customers
If the parts are available, the order is submitted to the store's back end processing software (8), which informs and bills the customer (1), schedules delivery, updates inventory, and reorders parts accordingly.
If the parts are unavailable, the order is submitted to a processor that generates an error message, which is emailed (9) to the customer (1).
Walk-in customers
If the parts are available, the order is submitted to the store's back end processing software (8), which informs the store clerk (2), updates inventory, and orders parts accordingly. The store clerk retrieves the parts from stock and sells them to the customer over-the-counter.
If the parts are unavailable, the order is submitted to a processor that generates an error message, which is emailed (9) to the local store's email account (2). The store clerk informs the customer, who can then decide whether to have the store clerk search the other stores for the parts.
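The stock-check dispatch above can be sketched with Camel's dynamic router pattern. The `stockChecker` bean and endpoint URIs are assumptions: the bean would return the next endpoint for the exchange (back-end processing (8) when parts are in stock, the error/email processor (9) when they are not), and `null` to end routing.

```xml
<!-- Sketch of the back-end dynamic router (6). The referenced bean
     looks up the ordered parts in the store database and returns the
     URI of the next endpoint, or null when routing is complete.
     Bean and endpoint names are illustrative. -->
<route id="backEndOrderRouter">
  <from uri="activemq:queue:storeA.orders"/>
  <dynamicRouter>
    <method ref="stockChecker" method="nextEndpoint"/>
  </dynamicRouter>
</route>
```

For example, `nextEndpoint` might return `"bean:orderProcessor"` for in-stock orders (8) and `"smtp://mailhost"` for the error notification (9), then `null` on the following call to terminate the route.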
Figure 9 shows how the Major Widgets integration plan implements a fault-tolerant mechanism to protect against loss of orders due to broker failure.
In this scenario, a slave broker (5a), which provides fault tolerance for the messaging system, is running within Fuse ESB Enterprise (FESB) on a separate, backup computer system at the main store. Both master broker and slave broker connect (5.2/5b) to a shared file system located on a storage area network (SAN). In addition, the messaging system has been configured for persistent messaging, storing messages to disk until consumers acknowledge their receipt, to protect against message loss. All clients have also been configured to use the failover transport, enabling them to reconnect automatically to the slave broker when the master broker fails.
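A shared file system master/slave setup of this kind is typically expressed in each broker's `activemq.xml`. In this hedged sketch, both brokers (5) and (5a) point their persistence store at the same SAN-mounted directory; the mount path and broker name are assumptions.

```xml
<!-- Hypothetical activemq.xml fragment, identical on the master (5) and
     slave (5a) machines. The first broker to lock the shared KahaDB
     directory becomes master; the directory path is illustrative. -->
<broker xmlns="http://activemq.apache.org/schema/core"
        brokerName="mainStoreBroker"
        persistent="true">
  <persistenceAdapter>
    <!-- message store on the SAN, shared by both brokers (5.2/5b) -->
    <kahaDB directory="/mnt/san/majorwidgets/kahadb"/>
  </persistenceAdapter>
  <transportConnectors>
    <!-- client-facing connector (5.1/5c) -->
    <transportConnector name="openwire" uri="tcp://0.0.0.0:61616"/>
  </transportConnectors>
</broker>
```

Setting `persistent="true"` gives the persistent messaging behavior described above: messages are written to the shared store until consumers acknowledge them.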
With the message store set up, the first broker ((5) in the Major Widgets scenario) to grab an exclusive lock on the shared file system becomes the master broker. The slave broker loops, waiting to grab the exclusive lock when it becomes available. When the master broker (5) fails, it immediately shuts down its transport connections (5.1) to all clients and gives up the exclusive lock on the shared file system (5.2). Concurrently, the slave broker (5a) activates its transport connections to all clients (5c) and grabs the file system lock (5b), thereby becoming the master broker. When the former master broker restarts, it becomes the slave broker and waits to grab the exclusive lock when it next becomes available.
When the master broker fails, all clients configured to fail over reconnect automatically to the next broker in the supplied list—in this case the initial slave broker (5a). Because the Major Widgets scenario employs a cluster of only two brokers, subsequent failures cause the brokers to switch master/slave roles back and forth between them.
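On the client side, the failover behavior comes from the connection URL. The following Spring bean fragment is a sketch; the hostnames stand in for the main and backup machines at the main store and are not taken from the Major Widgets plan.

```xml
<!-- Illustrative client-side connection factory. The failover: URL lists
     the master (5) and slave (5a) broker addresses; the client connects
     to whichever broker currently holds the lock and reconnects to the
     other on failure. Hostnames are assumptions. -->
<bean id="jmsConnectionFactory"
      class="org.apache.activemq.ActiveMQConnectionFactory">
  <property name="brokerURL"
            value="failover:(tcp://mainstore-primary:61616,tcp://mainstore-backup:61616)"/>
</bean>
```

Both the front end content-based router (4) and the back end dynamic routers (6) would use a connection factory of this form, so neither loses its broker connection permanently during a failover.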
For details on persistent messaging, see Configuring Message Broker Persistence. For details on the failover protocol, see Failover Protocol in Fault Tolerant Messaging.