(Migrated) Zato best practices


#1

(This message has been automatically imported from the retired mailing list)

The Zato docs already provide some good advice on best practices for
using Zato in real life, e.g. here:

https://zato.io/docs/admin/guide/ha.html
https://zato.io/docs/progguide/logging.html
https://zato.io/docs/progguide/debugging.html
https://zato.io/blog/posts/hot-deploy-api-service.html
https://zato.io/blog/posts/apitest-start.html

Maybe we can collect some more tips and best practices here on the
mailing list.

Some of the issues that are still not fully clear to me:

  • What should a minimal production environment look like? I guess I need
    a test/development cluster, maybe a staging cluster and a production
    cluster?

  • What should the relation between clusters and physical machines look
    like to make things redundant and performant? Should one cluster
    always run on one machine?

  • How should the development cluster be configured? I can imagine that
    it is better to use only one server so errors appear in one log file,
    and not to use a load balancer.

  • How should I configure an IDE so that it sees all the packages that a
    Zato service can access, e.g. to make use of IntelliSense features?

  • How should I version control and backup all the config files,
    credentials, start scripts, service module sources, and keep them
    separate from the log and pid files? Maybe the quickstart command should
    also provide ignore files? Where do I put my own libraries, SQLAlchemy
    models and other support files? Changes in zato_extra_paths are only
    picked up when the server is restarted, can this be avoided?

  • How do I test my services? Are there alternatives or complementary
    testing methods to apitest, maybe more like unit tests for service classes?

  • How do I document and version my services? It would be great to have
    Zato create central ESB documentation based on docstrings in the
    service classes, is something like that possible or planned?

Any advice or recommendations?


#2

On 07/04/15 16:49, Christoph Zwerschke wrote:

  • What should a minimal production environment look like? I guess I need
    a test/development cluster, maybe a staging cluster and a production
    cluster?

  • What should the relation between clusters and physical machines look
    like to make things redundant and performant? Should one cluster
    always run on one machine?

  • How should the development cluster be configured? I can imagine that
    it is better to use only one server so errors appear in one log file,
    and not to use a load balancer.

I believe it will make most sense to add it to the documentation so it’s
kept in one place and updated as new mechanisms are added to the platform.

But in practice there are no limits and one can easily use:

  • A single server, without web-admin or LB, configured from Docker
  • A quickstart cluster running on one system
  • A single cluster with servers running on separate systems
  • Several clusters each with several servers and an external LB in front
    of them all
  • How should I configure an IDE so that it sees all the packages that a
    Zato service can access, e.g. to make use of IntelliSense features?

This could be simply a matter of executing:

$ /opt/zato/2.0.3/bin/py -c "import sys; print(':'.join(sys.path))"

… and adding it to an IDE’s Python path.

If a given IDE can work out how to use eggs + setup.py, you can also
simply point it to:

/opt/zato/2.0.3/eggs
/opt/zato/2.0.3/zato-*

  • How should I version control and backup all the config files,
    credentials, start scripts, service module sources, and keep them
    separate from the log and pid files? Maybe the quickstart command should
    also provide ignore files?

Hm, can you please explain what should be ignored and why? They are all
in separate directories exactly so they don’t interfere with each other.

  • Where do I put my own libraries, SQLAlchemy models and other
    support files?

They belong to zato_extra_paths:

https://zato.io/docs/admin/guide/installing-services.html#zato-extra-paths
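For instance, a support module can simply be dropped into zato_extra_paths and then imported from services like any other package. A minimal sketch, with a made-up module and function name for illustration:

```python
# utils.py - a hypothetical support module placed in zato_extra_paths
from datetime import datetime

def to_iso(dt):
    """Render a datetime as an ISO 8601 string without microseconds."""
    return dt.replace(microsecond=0).isoformat()
```

A service deployed on the same server could then use `from utils import to_iso` directly.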

  • Changes in zato_extra_paths are only picked up when the server is
    restarted, can this be avoided?

https://mailman-mail5.webfaction.com/pipermail/zato-discuss/2015-April/001092.html

  • How do I test my services? Are there alternatives or complementary
    testing methods to apitest, maybe more like unit tests for service classes?

Not really, no, apitest is the recommended tool right now.
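For reference, apitest scenarios are plain-text feature files. A minimal sketch along the lines of the examples in apitest's own documentation (the address, path and JSON Pointer below are purely illustrative):

```
Feature: customer-api

Scenario: Get a customer as JSON

    Given address "http://localhost:17010"
    Given URL path "/customer/get"
    Given format "JSON"

    When the URL is invoked

    Then status is "200"
    And JSON Pointer "/name" is "John Doe"
```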

  • How do I document and version my services? It would be great to have
    Zato create central ESB documentation based on docstrings in the
    service classes, is something like that possible or planned?

It wasn’t planned but can be added - personally I am leaning towards a
top-to-bottom approach with processes, activity diagrams, and high- and
low-level design documents - with integration services being an
implementation rather than a starting point.

But I understand that auto-docs could be useful, sure. Without a
contributor or sponsor the work won’t start immediately though.


#4

Hi! I am late to the party but I recently faced an issue with my own unit testing for Zato services.

We have Zato running in Docker, and using the Zato libraries is rather hard because of this.

I came up with the following way to test my services using mock.

In my zato-service.py file, I added the following under the imports:

# -*- coding: utf-8 -*-
# Zato
from __future__ import absolute_import, division
from __future__ import print_function, unicode_literals

import httplib
from datetime import datetime

try:
    from zato.server.service import Service
except ImportError:
    # Override for unit tests
    from mock import Mock

    class Service:
        wsgi_environ = Mock()
        request = Mock()
        response = Mock()
        log_input = Mock()
        logger = Mock()

Basically, I used mock and overwrote the Service class and some of its properties (I only mocked the items I used in my service).

Then, to test my service, I had the following unit test in place.

def test_summary_response(self):
    from my_zato_service import myService
    a = myService()
    conn = Mock()
    conn.conn = mock_test_request(
        {
            u'agents': [
                {u'last_modified': u'2017-05-25T17:00:27.061', u'id': 12},
                {u'last_modified': u'2017-02-16T11:51:59.681', u'id': 6},
                {u'last_modified': u'2017-05-29T15:53:22.103', u'id': 16},
                {u'last_modified': u'2017-05-29T15:32:16.379', u'id': 14},
                {u'last_modified': u'2017-05-22T16:18:16.120', u'id': 11},
                {u'last_modified': u'2017-03-09T14:55:25.537', u'id': 8},
                {u'last_modified': u'2017-02-16T11:50:15.620', u'id': 5},
                {u'last_modified': u'2017-05-26T13:46:45.751', u'id': 4},
                {u'last_modified': u'2017-02-08T08:28:54.736', u'id': 3},
                {u'last_modified': u'2017-05-29T15:30:39.907', u'id': 13}
            ],
            u'sync_time': None
        }, 200)
    a.wsgi_environ = {"HTTP_UPSTREAM_HOST": 'SOME_HOST'}
    # HTTP_API_KEY
    a.request.payload = {}
    a.response.payload = {}
    a.logger = Mock()  # a mocked logger
    a.log_input.return_value = None
    a.cid = 12345
    a.outgoing.plain_http = {'eos-agent-summary': conn}

    # Handle
    a.handle()
    response = a.response.payload
    assert type(response) is dict
    assert response['status'] is httplib.OK
    assert type(response['response']) is list
    # Test each item in the dict
    for i in response['response']:

        # Ensure only id and last modified are being returned
        for key in i.keys():
            assert key in ['id', 'last_modified']

Because I used a plain HTTP API, I could also mock the entire request with this function:

def mock_test_request(return_text, status_code):
    import json
    response = Mock()
    mock_response = Mock()
    mock_response.text = json.dumps(return_text)
    mock_response.json.return_value = return_text
    mock_response.status_code = status_code
    response.get.return_value = mock_response
    response.post.return_value = mock_response
    response.patch.return_value = mock_response
    return response

With this approach I am able to have 100% test coverage for all my Zato services. I still need to test this approach with other API response formats, but I can assure you that this works with standard REST APIs.

That covers testing my code; I also test my services using apitest, and then I further test my integration with performance testing.


#5

Thanks @giannis.katsini - this is very useful!


#6

I’ve written an unpublished blog post covering unit tests in Zato services which I will share when ready :slight_smile:


#7

Sounds very interesting @giannis.katsini - can’t wait to read it!


#8

Hey @dsuch. I don’t have my blog up yet but I created a repo in the meantime. I’ll write up more in-depth documentation soon.

I hope that this helps people who were stuck where I was :slight_smile: