This is a boilerplate to help you adopt Envoy.
There are multiple ways to configure Envoy. One convenient way to manage different egress traffic is to route it by hostname (using virtual hosts). By doing so, you can use one egress port for all your egress dependencies:
static_resources:
  listeners:
    - name: egress_listener
      address:
        socket_address:
          address: 0.0.0.0
          port_value: 12345
      filter_chains:
        - filters:
            - name: envoy.http_connection_manager
              typed_config:
                "@type": type.googleapis.com/envoy.extensions.filters.network.http_connection_manager.v3.HttpConnectionManager
                codec_type: AUTO
                stat_prefix: http.test.egress
                use_remote_address: true
                route_config:
                  name: egress_route_config
                  virtual_hosts:
                    - name: foo_service
                      domains:
                        - foo.service:8888 # Do not miss the port number here
                      routes:
                        - match:
                            prefix: /
                          route:
                            cluster: remote_foo_server
                    - name: bar_service
                      domains:
                        - bar.service:8888 # Do not miss the port number here
                      routes:
                        - match:
                            prefix: /
                          route:
                            cluster: remote_bar_server
                http_filters:
                  - name: envoy.router
                    typed_config:
                      "@type": type.googleapis.com/envoy.extensions.filters.http.router.v3.Router
                      dynamic_stats: true
But this brings a new problem: your code becomes verbose. Every request now has to be sent to 127.0.0.1:12345, where the egress port is listening, while the real destination still has to be carried in the Host header so Envoy can pick the right virtual host. This library is going to help you deal with these things elegantly.
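For illustration, without this library a call might look like the following hypothetical node-fetch sketch (the function name is made up; foo.service:8888 matches the virtual host configured above):

const fetch = require("node-fetch");

async function callFooThroughEnvoyManually() {
  // every call targets the local egress listener...
  const res = await fetch("http://127.0.0.1:12345/path/to/rpc", {
    method: "POST",
    headers: {
      // ...while the real destination has to be put in the Host header by hand
      host: "foo.service:8888",
      "content-type": "application/json",
      // ...and any context/tracing headers have to be propagated manually as well
    },
    body: JSON.stringify({ message: "ping" }),
  });
  return res.json();
}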
First, let's tell the library where the egress port is bound. A recommended way is to set this information on the ingress headers via request_headers_to_add:
request_headers_to_add:
  - header:
      key: x-tubi-envoy-egress-port
      value: "12345"
  - header:
      key: x-tubi-envoy-egress-addr
      value: 127.0.0.1
You can also set this via the constructor parameters of EnvoyContext.
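For example (a minimal sketch; it assumes you know the egress address at construction time, matching the constructor parameters shown in the low level API section below):

const { EnvoyContext } = require("envoy-node");

const context = new EnvoyContext(
  req.headers, // the incoming headers (or grpc.Metadata)
  12345,       // envoyEgressPort, used if it cannot be read from the headers
  "127.0.0.1"  // envoyEgressAddr, used if it cannot be read from the headers
);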
For HTTP, you can create the client like this:
const { EnvoyHttpClient, HttpRetryOn } = require("envoy-node");

async function awesomeAPI(req, res) {
  const client = new EnvoyHttpClient(req.headers);
  const url = `http://foo.service:10080/path/to/rpc`;
  const request = {
    message: "ping",
  };
  const optionalParams = {
    // timeout in 1 second
    timeout: 1000,
    // Envoy will retry if the server returns HTTP 409 (for now)
    retryOn: [HttpRetryOn.RETRIABLE_4XX],
    // retry 3 times at most
    maxRetries: 3,
    // each retry will time out in 300 ms
    perTryTimeout: 300,
    // any other headers you want to set
    headers: {
      "x-extra-header-you-want": "value",
    },
  };
  const serializedJsonResponse = await client.post(url, request, optionalParams);
  res.send({ serializedJsonResponse });
  res.end();
}
For gRPC, you can create the client like this:
const grpc = require("grpc");
const { envoyProtoDecorator, GrpcRetryOn } = require("envoy-node");

const PROTO_PATH = __dirname + "/ping.proto";
const Ping = grpc.load(PROTO_PATH).test.Ping;

// the original client class will be decorated as a new class
const PingClient = envoyProtoDecorator(Ping);

async function awesomeAPI(call, callback) {
  const client = new PingClient("bar.service:10081", call.metadata);
  const request = {
    message: "ping",
  };
  const optionalParams = {
    // timeout in 1 second
    timeout: 1000,
    // Envoy will retry if the server returns DEADLINE_EXCEEDED
    retryOn: [GrpcRetryOn.DEADLINE_EXCEEDED],
    // retry 3 times at most
    maxRetries: 3,
    // each retry will time out in 300 ms
    perTryTimeout: 300,
    // any other headers you want to set
    headers: {
      "x-extra-header-you-want": "value",
    },
  };
  const response = await client.pathToRpc(request, optionalParams);
  callback(undefined, { remoteResponse: response });
}
The methods of the decorated client keep their original names, but they are also decorated to send the Envoy context for you. You can also specify the optional params (the last argument) for features provided by Envoy, such as timeout, retryOn, maxRetries and perTryTimeout.
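The optional params can be omitted entirely; since the decorated methods are async, a bare call is simply (a minimal sketch reusing the PingClient from the example above):

const client = new PingClient("bar.service:10081", call.metadata);
const response = await client.pathToRpc({ message: "ping" });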
NOTE: the decorated methods have an async signature. Streaming calls are decorated to carry the Envoy context as well, but the request params (timeout etc.) are not tested for them, and Envoy does not document how it deals with streaming.

Client streaming:

const stream = innerClient.clientStream((err, response) => {
  if (err) {
    // error handling
    return;
  }
  console.log("server responses:", response);
});
stream.write({ message: "ping" });
stream.write({ message: "ping again" });
stream.end();
Server streaming:

const stream = innerClient.serverStream({ message: "ping" });
stream.on("error", error => {
  // handle error here
});
stream.on("data", (data: any) => {
  console.log("server sent:", data);
});
stream.on("end", () => {
  // ended
});
Bidirectional streaming:

const stream = innerClient.bidiStream();
stream.write({ message: "ping" });
stream.write({ message: "ping again" });
stream.on("error", error => {
  // handle error here
});
stream.on("data", (data: any) => {
  console.log("server sent:", data);
});
stream.on("end", () => {
  // the server has ended the stream
});
stream.end();
If you want more control, you can also use the low level APIs of this library:
const {
  envoyFetch,
  EnvoyContext,
  EnvoyHttpRequestParams,
  EnvoyGrpcRequestParams,
  envoyRequestParamsRefiner
} = require("envoy-node");

// ...

const context = new EnvoyContext(
  headerOrMetadata,
  // specify the port if it cannot be determined from
  // - the `x-tubi-envoy-egress-port` header or
  // - the environment variable ENVOY_DEFAULT_EGRESS_PORT
  envoyEgressPort,
  // specify the address if it cannot be determined from
  // - the `x-tubi-envoy-egress-addr` header or
  // - the environment variable ENVOY_DEFAULT_EGRESS_ADDR
  envoyEgressAddr
);

// for HTTP
const params = new EnvoyHttpRequestParams(context, optionalParams);
envoyFetch(params, url, init /* init like original node-fetch */)
  .then(res => {
    console.log("envoy tells:", res.overloaded, res.upstreamServiceTime);
    return res.json(); // or res.text(), just use it as what node-fetch returned
  })
  .then(/* ... */);

// you are using request?
const yourOldRequestParams = {}; /* url or options */
request(envoyRequestParamsRefiner(yourOldRequestParams, context /* or headers, grpc.Metadata */));

// for gRPC
const grpcParams = new EnvoyGrpcRequestParams(context, optionalParams);
const client = new Ping(
  `${context.envoyEgressAddr}:${context.envoyEgressPort}`, // the envoy egress address and port
  grpc.credentials.createInsecure()
);
const requestMetadata = grpcParams.assembleRequestMeta();
client.pathToRpc(
  request,
  requestMetadata,
  {
    host: "bar.service:10081"
  },
  (error, response) => {
    // ...
  });
Check out the detailed documentation if you need more information.

Do you find it painful to propagate the context information through your function calls' parameters? If you are using Node.js 8 or later (with async_hooks support), here is a solution for you:
import { envoyContextStore } from "envoy-node"; // import the store

envoyContextStore.enable(); // call this once when your application initializes

// for each request, call this exactly once:
envoyContextStore.set(new EnvoyContext(req.headers));

// later, to get the context back, simply:
envoyContextStore.get();
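With the store enabled, code deep in the call stack can pick the context up without it being threaded through every parameter. A hedged sketch combining the store with the low level envoyFetch API described above (somewhereDeepInYourCode is a hypothetical function):

const { envoyContextStore, EnvoyHttpRequestParams, envoyFetch } = require("envoy-node");

async function somewhereDeepInYourCode() {
  const context = envoyContextStore.get(); // the context set earlier for this request
  const params = new EnvoyHttpRequestParams(context, { timeout: 1000 });
  const res = await envoyFetch(params, "http://foo.service:8888/path/to/rpc");
  return res.json();
}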
IMPORTANT: Make sure the set method is called exactly once per request, or you will get an incorrect context. Please check the documentation for more details. (TBD: we are working on a blog post covering the details.) Also, due to the async_hooks implementation, destroy is not called when the code uses HTTP keep-alive, so please use setEliminateInterval to set how long old, unread context data is kept before it is deleted, or you may have a memory leak. The default (5 minutes) is used if you don't set it.
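For example, a hedged sketch of wiring this up in an Express-style app (app is assumed; it also assumes setEliminateInterval is exposed on envoyContextStore alongside enable / set / get, so check the API documentation for the exact location):

const { envoyContextStore, EnvoyContext } = require("envoy-node");

envoyContextStore.enable(); // once, at application start
envoyContextStore.setEliminateInterval(10 * 60 * 1000); // keep unread context data for 10 minutes

app.use((req, res, next) => {
  envoyContextStore.set(new EnvoyContext(req.headers)); // exactly once per request
  next();
});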
If you are developing the application, you probably do not have Envoy running. You may want to call the service directly:
Either:
new EnvoyContext({
  meta: grpcMetadata_Or_HttpHeader,
  /**
   * For dev or test environments, we usually don't have Envoy running.
   * Setting directMode = true will make all the traffic be sent directly.
   * If you set directMode to true, envoyManagedHosts will be ignored and set to an empty set.
   */
  directMode: true,
  /**
   * To migrate services to Envoy step by step, we can route traffic to Envoy only for the
   * services that have already been migrated. Fill this set with the migrated services.
   * This field defaults to `undefined`, which means all traffic will be routed to Envoy.
   * If this field is `undefined`, this library will also try to read it from the
   * `x-tubi-envoy-managed-host` headers. You can set them in the Envoy config, like this:
   *
   * ```yaml
   * request_headers_to_add:
   *   - header:
   *       key: x-tubi-envoy-managed-host
   *       value: hostname:12345
   *   - header:
   *       key: x-tubi-envoy-managed-host
   *       value: foo.bar:8080
   * ```
   *
   * If you set this to an empty set, no traffic will be routed to Envoy.
   */
  envoyManagedHosts: new Set(["some-hostname:8080"])
});
or:
export ENVOY_DIRECT_MODE=true # 1 works as well
For developing or running the tests of this library, you need an envoy binary in your PATH, or you can download one and add it:

$ npm run download-envoy
$ export PATH=./node_modules/.bin/:$PATH

To commit your changes:

$ git add . # or the things you want to commit
$ npm run commit # and answer the commit message accordingly
License: MIT
This library was initialized with alexjoverm's typescript-library-starter.
Thanks to @mattklein123 and the Envoy community for the questions and answers.