Coding in Cloud Flow

For Cloud Flow, the Custom Code block is functional only in apps on the Business edition or higher; it can, however, be tested in the Standard edition.

In Cloud Flow, the Custom Code block lets you add Node.js-based JavaScript code for advanced custom functionality.

Node.js Libraries/SDKs

Your Zingy app's server-side backend includes a curated set of Node.js SDKs and libraries that can be used within your app's Cloud Flow Web Hooks and Background Tasks via the Custom Code block.

For more information about the integrated libraries and SDKs, please see SDK/Library Integration.
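As a quick sketch, an integrated library is loaded with a standard require call inside the Custom Code block. The example below assumes axios is part of the integrated set (it is also used in the streaming examples later in this section):

// Load an integrated library inside a Custom Code block.
// Assumption: axios is part of the integrated SDK/Library set.
let axios = require('axios');

// Use the library as in any Node.js module
axios.get('https://example.com/api/status')
    .then(function(resp) {
        console.log('Status: ' + resp.status);
    });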

Cloud Flow Variables and Functions

The variables and functions that are added to the Cloud Flow (using blocks) can also be referenced by the code within the Custom Code block.
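For example, a minimal sketch is shown below. The names orderTotal and sendAlert are hypothetical, standing in for a variable and a function defined with Cloud Flow blocks; the sketch assumes they are available as plain JavaScript identifiers inside the Custom Code block:

// orderTotal: a hypothetical Cloud Flow variable defined using blocks
// sendAlert:  a hypothetical Cloud Flow function defined using blocks
if (orderTotal > 1000) {
    // Call the Cloud Flow function from Custom Code
    sendAlert('Large order received: ' + orderTotal);
}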

Web Hook Response Streaming

Cloud Flow Web Hooks support advanced options for sending the response to the requesting client, as detailed below:

Piped (Pass-through)

When using the Piped (Pass-through) response streaming option, the response data needs to be provided as a Readable stream to the znResponsePipe function, the prototype for which is shown below:

znResponsePipe(
    readable,     /* stream.Readable */
    contentType,   /* string */
    noAutoExit,    /* boolean */
    endCbFn        /* function */
)
Parameters:

readable (stream.Readable)
The stream to read from and pipe into the response.

contentType (string)
The value for the HTTP Content-Type header in the response.

noAutoExit (boolean)
By default, the znResponsePipe function auto exits the Web Hook once the stream is fully read and piped. This optional parameter lets you disable that behavior and receive a callback notification at the function specified in the endCbFn parameter.

endCbFn (function)
When noAutoExit is set to true, this function is called when the stream has been fully read or upon error. A string parameter is passed to indicate the status.

Example

The example below shows a large CSV file being retrieved from an external source using the axios library. Since a responseType of stream was specified for axios, it provides a Readable stream at httpResp.data, which is passed to the znResponsePipe function. The contentType parameter is set to text/csv.

let axios = require('axios');
// URL of the large CSV file
let url = 'https://media.githubusercontent.com/media/datablist/sample-csv-files/main/files/customers/customers-100.csv';
axios({
    method: 'get',
    url: url,
    responseType: 'stream'
}).then(function(httpResp) {
  // Call znResponsePipe with the 
  // readable stream (httpResp.data)
  znResponsePipe(httpResp.data, 'text/csv');  
});

The above example conserves your Zingy app's server-side memory by directly streaming the large CSV file, instead of reading it fully and then transferring it as part of the Web Hook's response.
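If you need to run additional logic after the piped response completes, the noAutoExit and endCbFn parameters described above can be used. The sketch below adapts the same example under that assumption; the exact status string values are not documented here, so the callback simply logs whatever it receives:

let axios = require('axios');
// URL of the large CSV file
let url = 'https://media.githubusercontent.com/media/datablist/sample-csv-files/main/files/customers/customers-100.csv';
axios({
    method: 'get',
    url: url,
    responseType: 'stream'
}).then(function(httpResp) {
    // Disable auto exit and receive a callback when piping completes
    znResponsePipe(httpResp.data, 'text/csv', true, function(status) {
        // status is a string indicating how the stream ended
        console.log('Pipe finished with status: ' + status);
        // Perform any follow-up work here before the Web Hook exits
    });
});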

Chunked

When using the Chunked response streaming option, the response data is sent in batches (or chunks) to the requesting client. It sets the HTTP Transfer-Encoding header to chunked in the response.

The following functions facilitate this implementation:

znResponseChunkStart(
    contentType /* string */
)

The znResponseChunkStart function, shown above, is used to initiate the response to the client. The contentType parameter specifies the value for the HTTP Content-Type header in the response.

znResponseChunkAdd(
    data /* string/binary/etc */
)

The znResponseChunkAdd function (above) is used to send the next chunk of data to the client. The data parameter specifies the chunk to be sent.

znResponseChunkEnd(
    noAutoExit,    /* boolean */
    endCbFn        /* function */
)

The znResponseChunkEnd function, shown above, is used to signal that all chunks have been sent to the client. By default, this function auto exits the Web Hook. The noAutoExit parameter, when set to true, prevents this and instead calls the function specified by the endCbFn parameter.
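Before looking at a fuller example, here is a minimal sketch that puts the three functions together without any external libraries; the chunk content is made up purely for illustration:

// Initiate the chunked response with a Content-Type
znResponseChunkStart('text/plain');

// Send a few chunks of data
for (let i = 1; i <= 3; i++) {
    znResponseChunkAdd('chunk number ' + i + '\n');
}

// Signal that all chunks have been sent;
// passing false lets the Web Hook auto exit
znResponseChunkEnd(false, null);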

Example

The example below shows the same CSV file (as the Piped example above) being retrieved using the axios library. Since a responseType of stream was specified for axios, it provides a Readable stream at httpResp.data, which is piped into the csv-parser library.

The csv-parser library parses the stream and produces an object for each line parsed. We are notified about this via the data event. In the handler for the data event, we convert the object to a JSON string and send it as a chunk using the znResponseChunkAdd function.

Upon receiving the end event, the znResponseChunkEnd function is called to close the response and exit the Web Hook.

let csv = require('csv-parser');
let axios = require('axios');
// URL of the large CSV file
let url = 'https://media.githubusercontent.com/media/datablist/sample-csv-files/main/files/customers/customers-100.csv';
axios({
    method: 'get',
    url: url,
    responseType: 'stream'
}).then(function(httpResp) {
    // call znResponseChunkStart to initiate the response
    znResponseChunkStart('text/plain');
    // Pipe the httpResp.data Readable stream into 
    // the Writeable stream provided by the csv-parser
    httpResp.data.pipe(csv())
    .on('data', (obj) => {
        // convert to JSON and send chunk
        // using znResponseChunkAdd
        znResponseChunkAdd(JSON.stringify(obj)); 
    })
    .on('end', () => { 
        // call znResponseChunkEnd to signal end  
        znResponseChunkEnd(false, null);    
    });    
});
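The example above handles only the data and end events. As an additional, hedged sketch, you may also want an error handler on the parsing pipeline; this assumes it is acceptable to call znResponseChunkEnd after an error so the Web Hook still exits:

// Extends the pipeline from the example above (httpResp in scope)
httpResp.data.pipe(csv())
    .on('data', (obj) => {
        znResponseChunkAdd(JSON.stringify(obj));
    })
    .on('error', (err) => {
        // Assumption: ending the chunked response on error
        // still allows the Web Hook to exit cleanly
        znResponseChunkAdd(JSON.stringify({ error: err.message }));
        znResponseChunkEnd(false, null);
    })
    .on('end', () => {
        znResponseChunkEnd(false, null);
    });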
