Parallel API Composition in AWS
Use this architecture if you need to call multiple downstream API endpoints, in parallel or in sequence, to compose a single aggregated response to your clients. Build a generic, parameterized circuit-breaker workflow, and build a custom workflow composition for each use case. Each composition calls multiple circuit-breaker sub-workflows (one per downstream target). Increase availability of circuit-breaker-related data using command query responsibility segregation (CQRS) to maintain a persistent, eventually consistent cache.
[Architecture diagram: clients send synchronous requests to Amazon API Gateway (API gateway, with gateway response caching); an AWS Step Functions Express Workflows composition calls nested Express Workflows circuit breakers in parallel; the circuit breakers reach microservices on Amazon ECS and, through a second Amazon API Gateway acting as an HTTP proxy, on-premises services; an AWS Lambda response composer aggregates the responses; the circuit breakers write through to Amazon MemoryDB for Redis (persistent cache) and read from it when a circuit is open; domain events from the microservices flow through an Amazon EventBridge event bus to a Lambda function that updates the cache (CQRS).]

Reviewed for technical accuracy May 17, 2022
© 2022, Amazon Web Services, Inc. or its affiliates. All rights reserved.
AWS Reference Architecture
To allow clients to send synchronous web requests to your microservices, create an API gateway using Amazon API Gateway.
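As a rough illustration, the following Python (boto3) sketch creates a REST API with a single GET /orders method. The API name, resource path, and region are placeholder assumptions; the integration with the composition workflow is wired up in the later steps.

```python
# Hypothetical sketch: create the client-facing REST API (names and region are assumptions).
import boto3

apigw = boto3.client("apigateway", region_name="us-east-1")

# The API that clients call synchronously.
api = apigw.create_rest_api(
    name="parallel-composition-api",                      # placeholder name
    endpointConfiguration={"types": ["REGIONAL"]},
)

# Find the root resource ("/") and add an /orders resource under it.
resources = apigw.get_resources(restApiId=api["id"])["items"]
root_id = next(r["id"] for r in resources if r["path"] == "/")
orders = apigw.create_resource(restApiId=api["id"], parentId=root_id, pathPart="orders")

# Expose GET /orders; its backend integration is configured in the workflow steps below.
apigw.put_method(
    restApiId=api["id"],
    resourceId=orders["id"],
    httpMethod="GET",
    authorizationType="NONE",                             # sketch only; use an authorizer in practice
)
```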
To invoke multiple microservices at the same time, use AWS Step Functions Express Workflows to create a workflow with parallel steps.
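A minimal sketch of such a composition, expressed as Amazon States Language in Python and created with boto3, might look like the following. The account ID, IAM role, target names, and the use of the sfn startSyncExecution SDK integration (one way for an Express parent to run nested Express circuit-breaker workflows synchronously) are assumptions, not the reference implementation.

```python
# Hypothetical composition Express workflow: a Parallel state fans out to one
# circuit-breaker sub-workflow per downstream target (ARNs and names are placeholders).
import json
import boto3

CIRCUIT_BREAKER_ARN = "arn:aws:states:us-east-1:111111111111:stateMachine:circuit-breaker"


def branch(target: str) -> dict:
    """One Parallel branch: synchronously run the circuit-breaker workflow for `target`."""
    state_name = f"Call{target.capitalize()}"
    return {
        "StartAt": state_name,
        "States": {
            state_name: {
                "Type": "Task",
                "Resource": "arn:aws:states:::aws-sdk:sfn:startSyncExecution",
                "Parameters": {
                    "StateMachineArn": CIRCUIT_BREAKER_ARN,
                    "Input": json.dumps({"target": target}),
                },
                "End": True,
            }
        },
    }


definition = {
    "StartAt": "CallDownstreamApisInParallel",
    "States": {
        "CallDownstreamApisInParallel": {
            "Type": "Parallel",
            "Branches": [branch("orders"), branch("customers")],
            "Next": "ComposeResponse",
        },
        # The Parallel output (a list of branch results) is passed to the composer Lambda.
        "ComposeResponse": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111111111111:function:response-composer",
            "End": True,
        },
    },
}

boto3.client("stepfunctions").create_state_machine(
    name="parallel-api-composition",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::111111111111:role/composition-workflow-role",  # placeholder role
    type="EXPRESS",
)
```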
Use an AWS Lambda function to compose the multiple responses received from the downstream microservices into a single aggregated response to return to clients.
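A composer function could be as small as the following sketch. The input shape (the Parallel state's list of branch results, each carrying the nested execution's output as a JSON string with target and payload fields) is an assumption tied to the workflow sketch above.

```python
# Hypothetical response composer for the Parallel state's output.
import json


def handler(event, context):
    """Merge the parallel branch results into one aggregated response."""
    aggregated = {}
    for branch_result in event:                        # Parallel state output is a list
        output = json.loads(branch_result["Output"])   # sub-workflow output arrives as a JSON string
        aggregated[output["target"]] = output["payload"]

    # API Gateway returns this single aggregated document to the client.
    return aggregated
```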
To increase resilience, handle parallel requests with circuit breakers built with parameterized nested AWS Step Functions Express Workflows.
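One possible shape for the generic, parameterized circuit-breaker workflow is sketched below in Amazon States Language (as a Python dict). The state names, Lambda function ARNs, and the OPEN status value are assumptions; a production version would also track failure counts and half-open the circuit after a cool-down period.

```python
# Hypothetical parameterized circuit-breaker workflow ($.target selects the downstream API).
import json

circuit_breaker_definition = {
    "StartAt": "GetCircuitStatus",
    "States": {
        # Look up the circuit state for $.target (for example, "orders") in the cache.
        "GetCircuitStatus": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111111111111:function:get-circuit-status",
            "ResultPath": "$.circuit",
            "Next": "CircuitOpen?",
        },
        "CircuitOpen?": {
            "Type": "Choice",
            "Choices": [
                {"Variable": "$.circuit.status", "StringEquals": "OPEN", "Next": "ReadFromCache"}
            ],
            "Default": "CallDownstreamApi",
        },
        # Normal path: call the downstream microservice and refresh the cache (write-through).
        "CallDownstreamApi": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111111111111:function:call-downstream-api",
            "TimeoutSeconds": 3,
            "Catch": [{"ErrorEquals": ["States.ALL"], "Next": "OpenCircuitAndReadFromCache"}],
            "End": True,
        },
        # Failure path: trip the circuit, then serve the last known good response.
        "OpenCircuitAndReadFromCache": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111111111111:function:open-circuit",
            "Next": "ReadFromCache",
        },
        "ReadFromCache": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111111111111:function:read-from-cache",
            "End": True,
        },
    },
}

# Register it the same way as the composition workflow, with type="EXPRESS".
print(json.dumps(circuit_breaker_definition, indent=2))
```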
To invoke on-premises microservices, use API Gateway to proxy the HTTP requests.
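For example, a method can be pointed at an on-premises endpoint with an HTTP_PROXY integration. The IDs, the endpoint URL, and the VPC link (reaching the data center over Direct Connect or VPN) in this sketch are assumptions.

```python
# Hypothetical HTTP proxy integration in front of an on-premises endpoint.
import boto3

apigw = boto3.client("apigateway")

apigw.put_integration(
    restApiId="a1b2c3",                      # placeholder API id
    resourceId="d4e5f6",                     # placeholder resource id
    httpMethod="GET",
    type="HTTP_PROXY",                       # pass the request through unchanged
    integrationHttpMethod="GET",
    uri="https://onprem.example.internal/inventory",
    connectionType="VPC_LINK",               # route through a VPC link to the on-premises network
    connectionId="vpclink123",               # placeholder VPC link id
)
```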
Create an Amazon MemoryDB for Redis database to cache the responses from the microservices, using the write-through caching pattern. When a downstream microservice becomes unavailable or its latency reaches a threshold, the circuit is opened and all responses are read directly from the cache until the circuit is closed again.
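A write-through sketch using the redis-py client is shown below. The cluster endpoint, key layout, and the downstream call are placeholder assumptions; a multi-shard MemoryDB cluster would use the cluster-aware client instead.

```python
# Hypothetical write-through helper for one downstream target (endpoint and keys are assumptions).
import json
import redis

cache = redis.Redis(
    host="clustercfg.example-cache.memorydb.us-east-1.amazonaws.com",  # placeholder endpoint
    port=6379,
    ssl=True,   # MemoryDB requires TLS; use redis.cluster.RedisCluster for multi-shard clusters
)


def call_downstream(target: str, key: str) -> dict:
    # Placeholder for the real call to the microservice (HTTP, ECS service, and so on).
    return {"target": target, "key": key, "items": []}


def get_orders(customer_id: str) -> dict:
    """Normal path: call the service, then write the response through to the cache."""
    response = call_downstream("orders", customer_id)
    cache.set(f"response:orders:{customer_id}", json.dumps(response))
    return response


def get_orders_from_cache(customer_id: str):
    """Open-circuit path: serve the last response written through to the cache."""
    cached = cache.get(f"response:orders:{customer_id}")
    return json.loads(cached) if cached else None
```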
To increase availability, complement the cached data with the domain events raised by the downstream microservices using the CQRS pattern. Using Amazon EventBridge, create an event bus to collect the domain events, then handle them with a Lambda function to persist the domain aggregates in MemoryDB.
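The read-model updater could look like the following Lambda handler sketch; the event fields, key layout, and cache endpoint are assumptions about what the microservices publish.

```python
# Hypothetical CQRS read-model updater: persists domain aggregates published on the
# EventBridge bus into MemoryDB.
import json
import os
import redis

cache = redis.Redis(
    host=os.environ.get("MEMORYDB_ENDPOINT", "clustercfg.example.memorydb.us-east-1.amazonaws.com"),
    port=6379,
    ssl=True,
)


def handler(event, context):
    """Apply one domain event (for example, OrderUpdated) to the cached aggregate."""
    detail = event["detail"]                     # EventBridge delivers the payload under "detail"
    aggregate_id = detail["orderId"]             # assumed field emitted by the microservice

    # Upsert the aggregate so reads stay available even while the source service is down.
    cache.set(f"aggregate:order:{aggregate_id}", json.dumps(detail))
    return {"updated": aggregate_id}
```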
To increase performance, enable the API Gateway response cache, adjusting time-to-live (TTL) values and filters according to each request type.
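For instance, the stage-level cache and a per-method TTL override can be set with update_stage patch operations. The API ID, stage name, cache size, and TTL values below are assumptions to adapt per request type.

```python
# Hypothetical sketch: enable the stage response cache and tune TTLs per method.
import boto3

apigw = boto3.client("apigateway")

apigw.update_stage(
    restApiId="a1b2c3",          # placeholder API id
    stageName="prod",
    patchOperations=[
        {"op": "replace", "path": "/cacheClusterEnabled", "value": "true"},
        {"op": "replace", "path": "/cacheClusterSize", "value": "0.5"},            # GB
        # Default TTL for every method on the stage.
        {"op": "replace", "path": "/*/*/caching/ttlInSeconds", "value": "60"},
        # Longer TTL for GET /orders, which changes less often ("~1" escapes "/").
        {"op": "replace", "path": "/~1orders/GET/caching/enabled", "value": "true"},
        {"op": "replace", "path": "/~1orders/GET/caching/ttlInSeconds", "value": "300"},
    ],
)
```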