search cinemas.
@startuml level2
!includeurl https://raw.githubusercontent.com/RicardoNiepel/C4-PlantUML/master/C4_Container.puml
LAYOUT_TOP_DOWN
'LAYOUT_AS_SKETCH
LAYOUT_WITH_LEGEND
skinparam backgroundColor #FFFFFF
System_Ext(client, "Client")
System_Boundary(c1, "Microservices") {
Container(search, "Search", "Microservice", "Searches cinema data; also provides endpoints that perform CRUD operations on indices in Elasticsearch.")
Container(cinemas_catalog, "cinemas catalog", "Microservice", "Cinema catalog, movie schedules")
ContainerDb(elastic, "Elasticsearch", "6.3", "Stores cinema information")
}
Rel(client, search, "Uses", "HTTP")
Rel(search, cinemas_catalog, "Uses", "HTTP")
Rel_R(search, elastic, "Uses", "HTTP")
@enduml
- install chrome extension Pegmatite
- pick: handle transactions programmatically
- We cannot use go-mysql-elasticsearch, because Aurora Serverless does not support the binlog
- We don't use Logstash for real-time syncing, because it does not handle deleting indices well in real time
- We don't use an AWS-specific service to sync at first, because it would have to be managed differently in the local and serverless environments
@startuml
skinparam sequence {
ParticipantBorderColor #4B7EBB
ParticipantBackgroundColor #548CD0
ParticipantFontColor White
ArrowColor #B2B2B2
LifeLineBorderColor #A1A1A1
}
== Success ==
"Client" -> "Cinema Catalog Service": POST /cinemas
"Cinema Catalog Service" -> "Cinema Catalog Service": insert a cinema into DB
"Cinema Catalog Service" -> "Search Service": PUT /search/cinemas/:id
"Search Service" -> "Search Service": create a cinema index into ES
"Cinema Catalog Service" <- "Search Service": res done
"Client" <- "Cinema Catalog Service": res done
== Error transaction ==
"Client" -> "Cinema Catalog Service": POST /cinemas
"Cinema Catalog Service" -> "Cinema Catalog Service": insert a cinema into DB
"Cinema Catalog Service" -> "Search Service": PUT /search/cinemas/:id
"Search Service" -[#red]> "Search Service": create a cinema index into ES
"Cinema Catalog Service" <[#red]- "Search Service": res ERROR!!
"Cinema Catalog Service" -> "Cinema Catalog Service": delete a cinema from DB
"Client" <[#red]- "Cinema Catalog Service": res ERROR!!
@enduml
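The compensating-transaction flow in the POST /cinemas diagram above can be sketched roughly as below. This is only an illustration: the `Db` and `SearchService` interfaces and the `createCinema` function are hypothetical stand-ins, not the project's actual modules.

```typescript
// Sketch of the POST /cinemas flow: insert into the DB, index into ES via the
// Search Service, and compensate (delete the row) if indexing fails.
// All names here are hypothetical, not the project's real modules.
interface Cinema { id: string; name: string; }

interface Db {
  insert(c: Cinema): Promise<void>;
  remove(id: string): Promise<void>;
}

interface SearchService {
  putIndex(c: Cinema): Promise<void>; // PUT /search/cinemas/:id
}

async function createCinema(db: Db, search: SearchService, cinema: Cinema): Promise<void> {
  await db.insert(cinema);
  try {
    await search.putIndex(cinema);
  } catch (err) {
    // Compensate: undo the DB insert so the DB and ES stay consistent.
    await db.remove(cinema.id);
    throw err;
  }
}
```

The same shape applies to the PUT and DELETE diagrams that follow: perform the DB write, call the Search Service, and issue the inverse DB write if the call fails.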
@startuml
skinparam sequence {
ParticipantBorderColor #4B7EBB
ParticipantBackgroundColor #548CD0
ParticipantFontColor White
ArrowColor #B2B2B2
LifeLineBorderColor #A1A1A1
}
== Success ==
"Client" -> "Cinema Catalog Service": PUT /cinemas/:id
"Cinema Catalog Service" -> "Cinema Catalog Service": select for update
"Cinema Catalog Service" -> "Cinema Catalog Service": update a cinema in DB
"Cinema Catalog Service" -> "Search Service": PUT /search/cinemas/:id
"Search Service" -> "Search Service": update a cinema index in ES
"Cinema Catalog Service" <- "Search Service": res done
"Client" <- "Cinema Catalog Service": res done
== Error transaction ==
"Client" -> "Cinema Catalog Service": PUT /cinemas/:id
"Cinema Catalog Service" -> "Cinema Catalog Service": select for update
"Cinema Catalog Service" -> "Cinema Catalog Service": update a cinema in DB
"Cinema Catalog Service" -> "Search Service": PUT /search/cinemas/:id
"Search Service" -[#red]> "Search Service": update a cinema index into ES
"Cinema Catalog Service" <[#red]- "Search Service": res ERROR!!
"Cinema Catalog Service" -> "Cinema Catalog Service": update a cinema in DB to rollback
"Client" <[#red]- "Cinema Catalog Service": res ERROR!!
@enduml
@startuml
skinparam sequence {
ParticipantBorderColor #4B7EBB
ParticipantBackgroundColor #548CD0
ParticipantFontColor White
ArrowColor #B2B2B2
LifeLineBorderColor #A1A1A1
}
== Success ==
"Client" -> "Cinema Catalog Service": DELETE /cinemas/:id
"Cinema Catalog Service" -> "Cinema Catalog Service": select for update
"Cinema Catalog Service" -> "Cinema Catalog Service": delete a cinema in DB
"Cinema Catalog Service" -> "Search Service": DELETE /search/cinemas/:id
"Search Service" -> "Search Service": delete a cinema index in ES
"Cinema Catalog Service" <- "Search Service": res done
"Client" <- "Cinema Catalog Service": res done
== Error transaction ==
"Client" -> "Cinema Catalog Service": DELETE /cinemas/:id
"Cinema Catalog Service" -> "Cinema Catalog Service": select for update
"Cinema Catalog Service" -> "Cinema Catalog Service": delete a cinema in DB
"Cinema Catalog Service" -> "Search Service": DELETE /search/cinemas/:id
"Search Service" -[#red]> "Search Service": delete a cinema index into ES
"Cinema Catalog Service" <[#red]- "Search Service": res ERROR!!
"Cinema Catalog Service" -> "Cinema Catalog Service": insert a cinema in DB to rollback
"Client" <[#red]- "Cinema Catalog Service": res ERROR!!
@enduml
- It updates the cinema with the given id.
@startuml
skinparam sequence {
ParticipantBorderColor #4B7EBB
ParticipantBackgroundColor #548CD0
ParticipantFontColor White
ArrowColor #B2B2B2
LifeLineBorderColor #A1A1A1
}
"Client" -> "Search Service": GET /search/cinemas/:id/sync
"Cinema Catalog Service" <- "Search Service": GET the cinema of given id
"Cinema Catalog Service" -> "Search Service": res the cinema
"Search Service" -> "Search Service": put a cinemas index into Elasticsearch
"Client" <- "Search Service": res done
@enduml
- use this approach or Logstash
@startuml
skinparam sequence {
ParticipantBorderColor #4B7EBB
ParticipantBackgroundColor #548CD0
ParticipantFontColor White
ArrowColor #B2B2B2
LifeLineBorderColor #A1A1A1
}
"Client" -> "Search Service": req /search/cinemas/sync
loop until end of all cinemas
"Cinema Catalog Service" <- "Search Service": req cinemas
"Cinema Catalog Service" -> "Search Service": res cinemas
"Search Service" -> "Search Service": put a cinemas index into Elasticsearch
end
"Client" <- "Search Service": res done
@enduml
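The full-sync loop above can be sketched as follows; `fetchPage` and `indexCinema` are hypothetical stand-ins for the paged catalog request and the Elasticsearch put:

```typescript
// Sketch of the "loop until end of all cinemas" flow: page through the
// catalog until it is exhausted, indexing each cinema as it arrives.
// Both callbacks are hypothetical, injected so the loop stays testable.
interface Cinema { id: string; name: string; }

async function syncAllCinemas(
  fetchPage: (offset: number, limit: number) => Promise<Cinema[]>,
  indexCinema: (c: Cinema) => Promise<void>,
  pageSize = 100,
): Promise<number> {
  let offset = 0;
  let total = 0;
  for (;;) {
    const page = await fetchPage(offset, pageSize);
    if (page.length === 0) break; // end of all cinemas
    for (const c of page) await indexCinema(c);
    total += page.length;
    offset += page.length;
  }
  return total;
}
```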
- install chrome extension Pegmatite
npm run test
The tests are written with Mocha and the assertions are made using Chai:
- mocha
- chai
- chai-http
Test files are created under the test folder.
Configure AWS Credentials before running this command.
npm run deploy
- Create a PR from master to deploy-movies-service in GitHub and merge it.
db-migrate up --config db-spec/database.json -e dev
serverless invoke -f db-migrate-up -l
TSLint is a code linter that helps catch minor code quality and style issues.
All rules are configured through tslint.json.
To run TSLint you can call the main build script or just the TSLint task.
npm run build:live // runs full build including TSLint
npm run lint // runs only TSLint
- plantuml
- see swagger.yaml
API Document endpoints:
- Swagger spec endpoint: http://localhost:8001/api-docs
- Swagger UI endpoint: http://localhost:8001/docs
- plantuml
This project uses the following environment variables:
Name | Description | Default Value |
---|---|---|
CORS | CORS accepted values | "*" |
The folder structure of this app is explained below:
Name | Description |
---|---|
dist | Contains the distributable (or output) from your TypeScript build. |
node_modules | Contains all npm dependencies |
src | Contains source code that will be compiled to the dist dir |
src/config | Application configuration including environment-specific configs |
src/controllers | Controllers define functions to serve various express routes. |
src/lib | Common libraries to be used across your app. |
src/middlewares | Express middlewares which process the incoming requests before handling them down to the routes |
src/routes | Contain all express routes, separated by module/area of application |
src/models | Models define schemas that will be used in storing and retrieving data from Application database |
src/monitoring | Prometheus metrics |
src/index.ts | Entry point to express app |
test | Contains test code that tests the code in the src directory |
integration-test | Contains integration tests (run by npm run integration-test) and load tests (currently JMeter, run by npm run jmeter) |
db-spec | Contains a db-migration config file, dbspec.md in plantuml, test-data.sql |
migrations | Contains db-migration files by db-migrate create |
postman | Contains postman collection files |
package.json | Contains npm dependencies as well as build scripts |
tsconfig.json | Config settings for compiling source code only written in TypeScript |
tslint.json | Config settings for TSLint code style checking |
serverless.yml | Config settings for Serverless Framework |
nodemon.json | Config settings for nodemon to watch for file changes, making local development smoother |
All the different build steps are orchestrated via npm scripts, which allow us to call (and chain) terminal commands via npm.
Npm Script | Description |
---|---|
start | Runs full build and runs node on dist/index.js. Can be invoked with npm start |
build:copy | Copies the *.yaml file to the dist/ folder |
build:live | Full build. Runs ALL build tasks |
build:dev | Full build. Runs ALL build tasks with all watch tasks |
dev | Runs full build before starting all watch tasks. Can be invoked with npm run dev |
test | Runs build and runs tests using Mocha |
lint | Runs TSLint on project files |
Node.js debugging in VS Code is easy to set up and even easier to use. When you press F5, VS Code looks for a top-level .vscode folder with a launch.json file.
{
  "version": "0.2.0",
  "configurations": [
    {
      "type": "node",
      "request": "launch",
      "name": "Launch Program",
      "program": "${workspaceFolder}/dist/index.js",
      "preLaunchTask": "tsc: build - tsconfig.json",
      "outFiles": [
        "${workspaceFolder}/dist/*.js"
      ]
    },
    {
      // Name of configuration; appears in the launch configuration drop down menu.
      "name": "Run mocha",
      "request": "launch",
      // Type of configuration. Possible values: "node", "mono".
      "type": "node",
      // Workspace relative or absolute path to the program.
      "program": "${workspaceRoot}/node_modules/mocha/bin/_mocha",
      // Automatically stop program after launch.
      "stopOnEntry": false,
      // Command line arguments passed to the program.
      "args": ["--no-timeouts", "--compilers", "ts:ts-node/register", "${workspaceRoot}/test/*"],
      // Workspace relative or absolute path to the working directory of the program being debugged. Defaults to the current workspace.
      // Workspace relative or absolute path to the runtime executable to be used. Defaults to the runtime executable on the PATH.
      "runtimeExecutable": null,
      // Environment variables passed to the program.
      "env": { "NODE_ENV": "test" }
    }
  ]
}
The project uses the npm module oas-tools,
which provides middleware functions for metadata, security, validation and routing, and bundles Swagger UI into Express using the OpenAPI 3.0 spec.
It is also possible to set configuration variables; they are listed below:
Name | Type | Explanation / Values |
---|---|---|
logLevel | String | Possible values, from less to more verbose: error, warning, custom, info and debug. Ignored if customLogger is used. Default is info. |
logFile | String | Log file path. Ignored if customLogger is used. |
customLogger | Object | Replaces the included logger with the one specified here, so that you can reuse your own logger. logLevel and logFile will be ignored if this variable is used. Null by default. |
controllers | String | Controllers location path. |
strict | Boolean | Indicates whether validation must stop the request process if errors are found when validating against the specification file. False by default. |
router | Boolean | Indicates whether the router middleware should be used. True by default. |
validator | Boolean | Indicates whether the validator middleware should be used. True by default. |
docs | Boolean | Indicates whether API docs (Swagger UI) should be available. True by default. The swagger-ui endpoint is accessible at /docs. |
oasSecurity | Boolean | Indicates whether security components defined in the spec file will be handled based on the securityFile settings. securityFile will be ignored if this is set to false. Refer to oasSecurity for more information. False by default. |
securityFile | Object | Defines the settings that will be used to handle security. Ignored if oasSecurity is set to false. Null by default. |
oasAuth | Boolean | Indicates whether authorization will be automatically handled based on the grantsFile settings. grantsFile will be ignored if this is set to false. Refer to oasAuth for more information. False by default. |
grantsFile | Object | Defines the settings that will be used to handle automatic authorization. Ignored if oasAuth is set to false. Null by default. |
ignoreUnknownFormats | Boolean | Indicates whether the z-schema validator must ignore unknown formats when validating requests and responses. True by default. |
To set these variables you can use the configure function, passing it either a JavaScript object or a YAML/JSON file containing such an object:
const options = {
  controllers: basePath + "/routes",
  loglevel: "debug",
  strict: true,
  router: true,
  validator: true,
  docs: !isProd
};
swaggerTools.configure(options);
To initialise, just type the following:
const swaggerDoc = loadDocumentSync(basePath + "/definition/swagger.yaml");
swaggerTools.initialize(swaggerDoc, app, function() {
  cb();
});
- Swagger Router
The Swagger Router connects the Express route handlers found in the controller files on the path specified, with the paths defined in the Swagger specification (swagger.yaml). The routing looks up the correct controller file and exported function based on parameters added to the Swagger spec for each path.
Here is an example for a hello world endpoint:
paths:
  /hello:
    get:
      x-swagger-router-controller: helloWorldRoute
      operationId: helloWorldGet
      tags:
        - /hello
      description: >-
        Returns the current weather for the requested location using the
        requested unit.
      parameters:
        - name: greeting
          in: query
          description: Name of greeting
          required: true
          schema:
            type: string
      responses:
        '200':
          description: Successful request.
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/Hello'
        default:
          description: Invalid request.
          content:
            application/json:
              schema:
                $ref: '#/components/schemas/Error'
The x-swagger-router-controller field points the middleware to a helloWorldRoute.ts file in the routes directory, while operationId names the handler function to be invoked.
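A matching controller could look like the sketch below (a hypothetical helloWorldRoute.ts); oas-tools invokes the exported function whose name equals the operationId. Minimal req/res types stand in for Express's here to keep the sketch self-contained:

```typescript
// Hypothetical sketch of src/routes/helloWorldRoute.ts for the spec above.
// oas-tools looks up the file named by x-swagger-router-controller and calls
// the exported function named by operationId ("helloWorldGet").
// Req/Res are minimal stand-ins for Express's Request/Response types.
type Req = { query: { greeting?: string } };
type Res = { status(code: number): Res; json(body: unknown): void };

export function helloWorldGet(req: Req, res: Res): void {
  // The greeting query parameter is validated by the oas-tools validator
  // before the handler runs (when validator is enabled).
  const greeting = req.query.greeting ?? "world";
  res.status(200).json({ message: `Hello, ${greeting}!` });
}
```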
- Like many engineering decisions, choosing a pagination technique involves tradeoffs. Since users typically access pages of information in a linear fashion, keyset pagination is usually considered the best choice for paginating ordered records in high-traffic web servers.
- Cursor-based pagination (aka keyset pagination) is a common pagination strategy that avoids many of the pitfalls of "offset/limit" pagination.
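A minimal in-memory sketch of keyset (cursor) pagination, assuming records are ordered by a unique id; in SQL this corresponds to `WHERE id > :cursor ORDER BY id LIMIT :n` instead of `OFFSET :m LIMIT :n`. Names here are illustrative, not the project's API:

```typescript
// Keyset (cursor) pagination over an id-ordered collection. Instead of
// skipping `offset` rows (which gets slower as offset grows and can skip or
// repeat rows under concurrent writes), we seek past the last id seen.
interface Cinema { id: number; name: string; }

function keysetPage(
  rows: Cinema[],
  afterId: number | null, // the cursor: last id of the previous page
  limit: number,
): { items: Cinema[]; nextCursor: number | null } {
  const sorted = [...rows].sort((a, b) => a.id - b.id);
  const items = sorted
    .filter((r) => afterId === null || r.id > afterId)
    .slice(0, limit);
  // A full page may have more rows after it; a short page is the last one.
  const nextCursor = items.length === limit ? items[items.length - 1].id : null;
  return { items, nextCursor };
}
```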