style: check misspell
Fix #3009.

Signed-off-by: spacewander <[email protected]>
spacewander committed Dec 10, 2020
1 parent 93e2f16 commit 820983c
Showing 14 changed files with 41 additions and 26 deletions.
15 changes: 15 additions & 0 deletions .github/workflows/spellchecker.yml
@@ -0,0 +1,15 @@
name: spellchecker
on: [pull_request]
jobs:
misspell:
name: runner / misspell
runs-on: ubuntu-latest
steps:
- name: Check out code.
uses: actions/checkout@v1
- name: Install
run: |
wget -O - -q https://git.io/misspell | sh -s -- -b .
- name: Misspell
run: |
find apisix doc bin t -not -path "t/toolkit/*.lua" -type f | xargs ./misspell
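The Misspell step above feeds a filtered file list to the checker. As a minimal sketch of how that `find` exclusion behaves (the directory tree below is a stand-in for illustration, not the real repository layout):

```shell
# Build a throwaway tree that mimics the repo shape (names are assumptions).
tmp=$(mktemp -d)
mkdir -p "$tmp/apisix" "$tmp/t/toolkit"
printf 'source\n' > "$tmp/apisix/core.lua"          # normal source: listed
printf 'vendored\n' > "$tmp/t/toolkit/vendored.lua" # vendored Lua: skipped
cd "$tmp"
# Same filter as the workflow step: every regular file under the listed
# dirs, minus vendored Lua files under t/toolkit; in CI the resulting list
# is piped to ./misspell via xargs.
find apisix t -not -path "t/toolkit/*.lua" -type f
# → apisix/core.lua
```

Note that `-path` matches against the whole relative path, so the `t/toolkit/*.lua` pattern excludes vendored Lua files at any depth under that directory.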
2 changes: 1 addition & 1 deletion apisix/core/ctx.lua
@@ -83,7 +83,7 @@ local function get_parsed_graphql(ctx)
end

if #res.definitions > 1 then
-log.warn("Mutliple operations are not supported.",
+log.warn("Multiple operations are not supported.",
"Only the first one is handled")
end

2 changes: 1 addition & 1 deletion apisix/core/etcd.lua
@@ -110,7 +110,7 @@ function _M.get_format(res, real_key, is_dir)
-- While in v3, this structure is flatten and all keys related the key asked for are `kvs`
res.body.node = kvs_to_node(res.body.kvs[1])
if not res.body.kvs[1].value then
--- remove last "/" when necesary
+-- remove last "/" when necessary
if string.byte(res.body.node.key, -1) == 47 then
res.body.node.key = string.sub(res.body.node.key, 1, #res.body.node.key-1)
end
8 changes: 4 additions & 4 deletions apisix/plugins/response-rewrite.lua
@@ -25,21 +25,21 @@ local schema = {
type = "object",
properties = {
headers = {
-description = "new headers for repsonse",
+description = "new headers for response",
type = "object",
minProperties = 1,
},
body = {
-description = "new body for repsonse",
+description = "new body for response",
type = "string",
},
body_base64 = {
-description = "whether new body for repsonse need base64 decode before return",
+description = "whether new body for response need base64 decode before return",
type = "boolean",
default = false,
},
status_code = {
-description = "new status code for repsonse",
+description = "new status code for response",
type = "integer",
minimum = 200,
maximum = 598,
2 changes: 1 addition & 1 deletion doc/aws.md
@@ -276,6 +276,6 @@ _TBD_

## Decouple APISIX and etcd3 on AWS

-For high availability and state consistency consideration, you might be interested to decouple the **etcd3** as a seperate cluster from **APISIX** not only for performance but also high availability and faught tolerance yet with highly reliable state consistency.
+For high availability and state consistency consideration, you might be interested to decouple the **etcd3** as a separate cluster from **APISIX** not only for performance but also high availability and fault tolerance yet with highly reliable state consistency.

_TBD_
2 changes: 1 addition & 1 deletion doc/plugins/kafka-logger.md
@@ -34,7 +34,7 @@

This will provide the ability to send Log data requests as JSON objects to external Kafka clusters.

-This plugin provides the ability to push Log data as a batch to you're external Kafka topics. In case if you did not recieve the log data don't worry give it some time it will automatically send the logs after the timer function expires in our Batch Processor.
+This plugin provides the ability to push Log data as a batch to you're external Kafka topics. In case if you did not receive the log data don't worry give it some time it will automatically send the logs after the timer function expires in our Batch Processor.

For more info on Batch-Processor in Apache APISIX please refer.
[Batch-Processor](../batch-processor.md)
2 changes: 1 addition & 1 deletion doc/plugins/response-rewrite.md
@@ -35,7 +35,7 @@ response rewrite plugin, rewrite the content returned by the upstream as well as
**senario**:

1. can set `Access-Control-Allow-*` series field to support CORS(Cross-origin Resource Sharing).
-2. we can set customized `status_code` and `Location` field in header to achieve redirect, you can alse use [redirect](redirect.md) plugin if you just want a redirection.
+2. we can set customized `status_code` and `Location` field in header to achieve redirect, you can also use [redirect](redirect.md) plugin if you just want a redirection.

## Attributes

2 changes: 1 addition & 1 deletion doc/plugins/sls-logger.md
@@ -31,7 +31,7 @@

`sls-logger` is a plugin which push Log data requests to ali cloud [Log Server](https://help.aliyun.com/document_detail/112903.html?spm=a2c4g.11186623.6.763.21321b47wcwt1u) with [RF5424](https://tools.ietf.org/html/rfc5424).

-This plugin provides the ability to push Log data as a batch to ali cloud log service. In case if you did not recieve the log data don't worry give it some time it will automatically send the logs after the timer function expires in our Batch Processor.
+This plugin provides the ability to push Log data as a batch to ali cloud log service. In case if you did not receive the log data don't worry give it some time it will automatically send the logs after the timer function expires in our Batch Processor.

For more info on Batch-Processor in Apache APISIX please refer
[Batch-Processor](../batch-processor.md)
2 changes: 1 addition & 1 deletion doc/plugins/tcp-logger.md
@@ -33,7 +33,7 @@

This will provide the ability to send Log data requests as JSON objects to Monitoring tools and other TCP servers.

-This plugin provides the ability to push Log data as a batch to you're external TCP servers. In case if you did not recieve the log data don't worry give it some time it will automatically send the logs after the timer function expires in our Batch Processor.
+This plugin provides the ability to push Log data as a batch to you're external TCP servers. In case if you did not receive the log data don't worry give it some time it will automatically send the logs after the timer function expires in our Batch Processor.

For more info on Batch-Processor in Apache APISIX please refer.
[Batch-Processor](../batch-processor.md)
2 changes: 1 addition & 1 deletion doc/plugins/udp-logger.md
@@ -33,7 +33,7 @@

This will provide the ability to send Log data requests as JSON objects to Monitoring tools and other UDP servers.

-This plugin provides the ability to push Log data as a batch to you're external UDP servers. In case if you did not recieve the log data don't worry give it some time it will automatically send the logs after the timer function expires in our Batch Processor.
+This plugin provides the ability to push Log data as a batch to you're external UDP servers. In case if you did not receive the log data don't worry give it some time it will automatically send the logs after the timer function expires in our Batch Processor.

For more info on Batch-Processor in Apache APISIX please refer.
[Batch-Processor](../batch-processor.md)
8 changes: 4 additions & 4 deletions t/core/response.t
@@ -61,7 +61,7 @@ GET /t
-=== TEST 3: multiple reponse headers
+=== TEST 3: multiple response headers
--- config
location = /t {
access_by_lua_block {
@@ -82,7 +82,7 @@ ccc: ddd
-=== TEST 4: multiple reponse headers by table
+=== TEST 4: multiple response headers by table
--- config
location = /t {
access_by_lua_block {
@@ -103,7 +103,7 @@ ccc: ddd
-=== TEST 5: multiple reponse headers (add)
+=== TEST 5: multiple response headers (add)
--- config
location = /t {
access_by_lua_block {
@@ -123,7 +123,7 @@ aaa: bbb, bbb
-=== TEST 6: multiple reponse headers by table (add)
+=== TEST 6: multiple response headers by table (add)
--- config
location = /t {
access_by_lua_block {
16 changes: 8 additions & 8 deletions t/node/upstream-node-dns.t
@@ -84,7 +84,7 @@ passed
return {address = "127.0.0.2"}
end
-error("unkown domain: " .. domain)
+error("unknown domain: " .. domain)
end
--- request
GET /hello
@@ -111,7 +111,7 @@ hello world
return {address = "127.0.0." .. count}
end
-error("unkown domain: " .. domain)
+error("unknown domain: " .. domain)
end
--- config
@@ -197,7 +197,7 @@ passed
return {address = "127.0.0.2"}
end
-error("unkown domain: " .. domain)
+error("unknown domain: " .. domain)
end
--- request
GET /hello
@@ -224,7 +224,7 @@ hello world
return {address = "127.0.0." .. count}
end
-error("unkown domain: " .. domain)
+error("unknown domain: " .. domain)
end
--- config
@@ -352,7 +352,7 @@ passed
return {address = "127.0.0." .. count}
end
-error("unkown domain: " .. domain)
+error("unknown domain: " .. domain)
end
--- config
@@ -439,7 +439,7 @@ passed
return {address = "127.0.0." .. count}
end
-error("unkown domain: " .. domain)
+error("unknown domain: " .. domain)
end
--- config
@@ -506,7 +506,7 @@ proxy request to 127.0.0.[56]:1980
return {address = "127.0.0.1"}
end
-error("unkown domain: " .. domain)
+error("unknown domain: " .. domain)
end
--- config
@@ -606,7 +606,7 @@ passed
return {address = "127.0.0." .. count}
end
-error("unkown domain: " .. domain)
+error("unknown domain: " .. domain)
end
--- config
2 changes: 1 addition & 1 deletion t/plugin/prometheus.t
@@ -912,7 +912,7 @@ passed
-=== TEST 49: pipeline of client request with successfuly authorized
+=== TEST 49: pipeline of client request with successfully authorized
--- pipelined_requests eval
["GET /hello", "GET /hello", "GET /hello", "GET /hello"]
--- more_headers
2 changes: 1 addition & 1 deletion t/router/graphql.t
@@ -157,7 +157,7 @@ query repo {
--- response_body
hello world
--- error_log
-Mutliple operations are not supported
+Multiple operations are not supported
