python project with local packages

This fits AWS Lambda function deployment.

To deploy Python AWS Lambda functions, AWS requires you to zip your code together with all its dependency packages into a .zip file and upload it. There are multiple options here:

Option1

Install packages into a virtual environment (using python -m venv or pipenv). When you zip files, move all your code into the venv's site-packages folder, then zip everything together (a rough sketch of this step follows the list below).

Good:

  • No need to add a path to tell Python where to import packages from, since the project root is automatically on Python's import path.

  • Using a venv is a natural way to manage the project.

Bad:

  • It's not easy to set up a CI/CD process for deployment.

  • Your code and the installed packages end up in the same root, which makes them hard to distinguish.
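
A minimal sketch of that manual packaging step, assuming a .venv virtual environment on Python 3.8 and a single handler.py (all of these names are assumptions; adjust them to your project):

# build_option1.py - rough sketch of Option1's manual packaging step.
# The .venv location, Python version and handler.py name are assumptions.
import shutil
import zipfile
from pathlib import Path

SITE_PACKAGES = Path(".venv/lib/python3.8/site-packages")

# copy your own code into the venv's site-packages folder
shutil.copy("handler.py", SITE_PACKAGES / "handler.py")

# zip the *contents* of site-packages so code and packages sit at the archive root
with zipfile.ZipFile("lambda.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for path in SITE_PACKAGES.rglob("*"):
        if path.is_file():
            zf.write(path, path.relative_to(SITE_PACKAGES))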

Option2

Install packages into the project root (pip install -r requirements.txt -t .). When you zip files, zip the whole project (excluding unnecessary files).

Good:

  • No need to add a path to tell Python where to import packages from, since the project root is automatically on Python's import path.

  • Easy to set up a CI/CD process.

Bad:

  • Your code and the installed packages end up in the same root, which makes them hard to distinguish.

Option3

If you use the Serverless Framework, you can use the Serverless Python Requirements plugin. It's easy, but under the hood its mechanism is essentially Option2: packages are still installed alongside your code.

Good:

  • Easy to configure.

Bad:

  • It needs Docker installed if you have to build native packages among your dependencies, such as psycopg2, NumPy, Pandas, etc.

  • When you deploy to AWS with this approach, you will find the installed packages and your code all in the project root as well.

Option4 (Applied)

Install packages into a project sub-folder such as packages (pip install -r requirements.txt -t packages).

Python doesn't know it can import packages from this folder, so we need to add it to sys.path manually. I found an easy way to achieve this: add a Python file packages_load.py:

import sys

# Add the "packages" sub-folder to sys.path (once) so Python can import the
# dependencies installed there. The relative path works because AWS Lambda runs
# your code from the task root, where the packages folder sits.
print("load packages")
if "packages" not in sys.path:
    sys.path.append("packages")

and in every Python file in the project, import it at the top, before any third-party imports: import packages_load
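
For illustration, a hypothetical handler module might look like this (the handler.py name, the requests dependency and the URL are assumptions, not part of the original post):

# handler.py - hypothetical Lambda handler; any module installed into packages/
# works the same way as requests does here.
import packages_load  # must come first: it puts the "packages" folder on sys.path
import requests       # resolved from the packages/ sub-folder at runtime


def handler(event, context):
    # example call using a dependency installed into packages/
    response = requests.get("https://example.com/health")
    return {"statusCode": response.status_code}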

When you zip files, zip the whole project (excluding unnecessary files); a small packaging script like the sketch below can do this.
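
This is only a sketch; the lambda.zip name and the exclusion list are assumptions, and a plain zip -r command with -x patterns works just as well:

# build_zip.py - rough sketch: zip the whole project for upload, skipping files
# that are not needed at runtime (the exclusion list is just an example).
import zipfile
from pathlib import Path

EXCLUDE_DIRS = {".git", ".venv", "__pycache__", "tests", ".vscode"}

with zipfile.ZipFile("lambda.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for path in Path(".").rglob("*"):
        if path.name == "lambda.zip" or not path.is_file():
            continue
        if any(part in EXCLUDE_DIRS for part in path.parts):
            continue
        zf.write(path)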

Pylint + VSCode issue

There is a small problem if you use pylint + VSCode: it will show errors on the imports (Unable to import XXX), and autocomplete for the installed packages doesn't work. To solve these problems, change .vscode/settings.json; the --init-hook makes pylint run import packages_load before linting, so the packages folder ends up on its sys.path, and extraPaths does the same for autocomplete:

{
    "python.linting.pylintArgs": [
        "--init-hook",
        "import packages_load"
    ],
    "python.autoComplete.extraPaths": [
        "packages"
    ]
}

Gitlab CI/CD script using serverless framework

Example of .gitlab-ci.yml:

image: node:latest

stages:
  - deploy

deploy_testing:
  stage: deploy
  before_script:
    - npm config set prefix /usr/local
    - apt-get update -y && apt-get install python3 python3-pip -y
    - npm install -g serverless
  script:
    - pip3 install -r requirements.txt -t packages
    - serverless deploy --stage testing --region xxx --verbose
  environment: testing
  variables:
    AWS_ACCESS_KEY_ID: '$TESTING_AWS_ACCESS_KEY_ID'
    AWS_SECRET_ACCESS_KEY: '$TESTING_AWS_SECRET_ACCESS_KEY'
    AWS_CLIENT_TIMEOUT: 600000
    SLS_DEBUG: '*'
  only:
    refs:
      - master
    changes:
      - requirements.txt
      - serverless.yml
      - "*.py"
      - "**/*.py"


deploy_production:
  stage: deploy
  before_script:
    - npm config set prefix /usr/local
    - apt-get update -y && apt-get install python3 python3-pip -y
    - npm install -g serverless
  script:
    - pip3 install -r requirements.txt -t packages
    - serverless deploy --region xxx --verbose
  environment: production
  variables:
    AWS_ACCESS_KEY_ID: '$PRODUCTION_AWS_ACCESS_KEY_ID'
    AWS_SECRET_ACCESS_KEY: '$PRODUCTION_AWS_SECRET_ACCESS_KEY'
    AWS_CLIENT_TIMEOUT: 600000
  only:
    refs:
      - tags

Good:

  • Easy to set up a CI/CD process.

  • Your code and dependency packages are separated, since all packages live in the packages sub-folder.

Bad:

  • Need to add the packages sub-folder to sys.path.

  • Need extra settings to work with pylint and VSCode.
