Definitely yes: it’s a non-trivial task, and it looks even more complex once you start digging into the details.
Fortunately, you don’t need to crack your head trying to solve that.
Just use the extractor.
What about external dependencies?

To make extraction work, you need all external dependencies installed locally. This should not be an issue, as you can always use environment isolation tools like venv.
Which versions will be listed in the output requirements?

The answer is: whatever versions are installed in the currently active site-packages directory. Installed packages usually have *.dist-info directories containing metadata about them; that’s where the versions are taken from.
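As a quick illustration of where such versions come from, the standard library’s importlib.metadata (Python 3.8+) reads exactly this dist-info metadata. A minimal sketch (the helper name is mine, not from the extractor):

```python
from importlib import metadata


def installed_version(package_name):
    """Look up a package's version from its *.dist-info metadata
    in the active site-packages, or return None if it isn't installed."""
    try:
        return metadata.version(package_name)
    except metadata.PackageNotFoundError:
        return None


# A package that is not installed yields None.
print(installed_version('no-such-package-hopefully'))  # None
```

This is the same lookup that pip and requirements extractors ultimately rely on, so the versions you see in the output reflect your local environment, not the newest releases on PyPI.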
Version Control

Now we have two representations of the same solution: one is the original source and the other is its generated mirror.
It would be reasonable to ask: what should we track in the source version control system, and how? Ewww. That’s a tough question, because it smells like duplication. But this little evil is needed.
Firstly, extraction can be automated.
Secondly, generated code can be kept in an isolated branch.
In this case it’s possible to set up continuous deployment as described in the next section.
Checking in code generated from one branch into another is left as an exercise to the reader, though.
Deployment

The most convenient way to deploy your functions is to set up a push trigger for a branch of a source repository containing the extracted functions. To do that you’ll need a cloudbuild.yaml file, which describes the deployment steps.
Please refer to the “Triggering Cloud Functions deployments” article for an explanation of the deployment steps needed for Cloud Functions.
The example build file needed to deploy our view is shown below:

```yaml
# cloudbuild.yaml
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  args: [
    'beta', 'functions', 'deploy', 'difficulty-get-data',
    '--entry-point', 'get_data',
    '--set-env-vars', 'CORS_ALLOW_ORIGIN=*',
    '--trigger-http',
    '--runtime', 'python37',
    '--memory', '128MB',
    '--region', 'europe-west1',  # watch out: your region may differ
  ]
  waitFor: ['-']  # wait for nobody, parallelize
  dir: 'get_data'
```

The image below depicts the definition of an example trigger.
[Image: GCP Cloud Build — example of a source repository trigger definition]

For a full example of a working build file, refer to demo-services/cloudbuild. An example report for a build triggered by a push to the source repo is shown below:

[Image: GCP Cloud Build — example of a source repository-triggered build report]

Each build will trigger redeployment of the functions. An example of a deployed function’s dashboard is shown below:

[Image: GCP Cloud Functions — example function’s dashboard]

Local Development

We are now able to destructure our views and push them as functions to a serverless platform.
But how are we going to develop and test them at all? There’s no web app yet; all we have is a bunch of web views.
No problem, let’s make an app then.
And let’s make it load and register our views dynamically.
functions.yml

First of all, let’s define functions.yml with definitions of the functions exposed by the project:

```yaml
- name: difficulty-get-data
  handler: demo_services.views:get_data
```

This is needed to map names of views to their corresponding functions.
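The handler format above (‘package.module:function’) can be resolved with a few lines of stdlib Python. A sketch of how the local app might load such a handler (load_handler is my name, not the project’s):

```python
import importlib


def load_handler(handler_path):
    """Resolve a 'package.module:function' string, as used in
    functions.yml, to the actual callable."""
    module_name, func_name = handler_path.split(':')
    module = importlib.import_module(module_name)
    return getattr(module, func_name)


# Illustration with a stdlib function instead of demo_services.views:get_data:
join = load_handler('posixpath:join')
print(join('a', 'b'))  # a/b
```

The same colon-separated convention is used by setuptools entry points, so it should feel familiar.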
This file might also be used to generate cloudbuild.yml. Next, it makes sense to define local endpoints.
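For instance, a generator could turn each functions.yml entry into the gcloud argument list used in the cloudbuild step shown earlier. A hedged sketch (the function name and default flag set are illustrative, not from the project):

```python
def deploy_args(name, entry_point, region='europe-west1'):
    """Build the gcloud 'functions deploy' argument list for one
    function, mirroring the cloudbuild.yaml step shown earlier."""
    return [
        'beta', 'functions', 'deploy', name,
        '--entry-point', entry_point,
        '--set-env-vars', 'CORS_ALLOW_ORIGIN=*',
        '--trigger-http',
        '--runtime', 'python37',
        '--memory', '128MB',
        '--region', region,
    ]


args = deploy_args('difficulty-get-data', 'get_data')
print(args[3], args[5])  # difficulty-get-data get_data
```

Writing these lists into a cloudbuild.yml template is then a matter of plain YAML serialization, one step per declared function.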
One name and a list of HTTP methods per endpoint is enough for our purposes.
Let’s define the endpoints in a local_endpoints.yml file; here’s the definition of our example endpoint:

```yaml
- name: difficulty-get-data
  methods: ['GET', 'OPTIONS']
```

Note that the OPTIONS method is needed to make CORS work.
We want to keep things as similar to production as possible, right?

Request injector

If you have taken a look at the definition of the get_data view, you might have noticed that it needs a request object as its first parameter:

```python
...
def get_data(request):
    pretty = 'pretty' in request.args
    ...
```

This object is the same flask.request that you have been using in your usual Flask projects.
The issue is that it’s passed as an argument only in Cloud Functions environment.
Plain Flask does not pass it as an argument, because in plain Flask it’s a global object which has to be imported.
So, how do we make our view run as a function on GCP and as a plain Flask view locally, without making any changes to it? Let’s inject request into the local version of the view! That can easily be achieved with a with_request decorator:

```python
from functools import wraps

from flask import request


def with_request(view_func):
    @wraps(view_func)
    def wrapper(*args, **kwargs):
        return view_func(request, *args, **kwargs)
    return wrapper
```

That’s it: just take any function and force request to be its first parameter.
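To see the pattern in isolation, here is the same injection trick with a stand-in object in place of flask.request, so it runs without Flask installed (FakeRequest is purely for demonstration):

```python
from functools import wraps


class FakeRequest:
    """Stand-in for flask.request, just for demonstration."""
    args = {'pretty': '1'}


request = FakeRequest()


def with_request(view_func):
    """Inject the module-level request object as the first argument."""
    @wraps(view_func)
    def wrapper(*args, **kwargs):
        return view_func(request, *args, **kwargs)
    return wrapper


@with_request
def get_data(request):
    return 'pretty' in request.args


print(get_data())  # True: the view received request without the caller passing it
```

The decorator bridges the two calling conventions: locally the global is injected, while on GCP the platform passes the request itself.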
local_run.py

Finally, it’s time to make a web app.
We’ll build a simple application dynamically. To achieve that we need to:

1. Read the definitions of endpoints from local_endpoints.yml.
2. Read the definitions of functions from functions.yml.
3. Load the functions and inject the request object into them as an argument.
4. Set up environment variables. In our example we’d like to bypass CORS during local development, so the CORS_ALLOW_ORIGIN variable is set to *.
5. Construct URL rules for the Flask app.
6. Make the Flask app and run it.
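The core of steps 1, 2, and 5 can be sketched dependency-free. Here the parsed YAML is shown as plain Python lists, and make_url_rules is my name, not the project’s; the resulting tuples are what you would feed to Flask’s app.add_url_rule():

```python
def make_url_rules(endpoints, functions):
    """Join endpoint definitions with function definitions by name and
    produce (url_rule, endpoint_name, methods) tuples for the Flask app."""
    handlers = {f['name']: f['handler'] for f in functions}
    rules = []
    for ep in endpoints:
        if ep['name'] not in handlers:
            raise KeyError(f"no function declared for endpoint {ep['name']}")
        rules.append(('/' + ep['name'], ep['name'], ep['methods']))
    return rules


# Parsed equivalents of our two YAML files:
endpoints = [{'name': 'difficulty-get-data', 'methods': ['GET', 'OPTIONS']}]
functions = [{'name': 'difficulty-get-data',
              'handler': 'demo_services.views:get_data'}]
print(make_url_rules(endpoints, functions))
# [('/difficulty-get-data', 'difficulty-get-data', ['GET', 'OPTIONS'])]
```

Raising on an endpoint with no matching function catches typos between the two YAML files early, instead of at request time.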
The implementation of all that is available as local_run.py. Executing the runner spawns a plain Flask app:

```
 * Serving Flask app "local_run" (lazy loading)
 * Environment: production
   WARNING: Do not use the development server in a production environment.
   Use a production WSGI server instead.
 * Debug mode: on
 * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
 * Restarting with stat
 * Debugger is active!
 * Debugger PIN: 935-883-901
```

Phew! We are able to run the same code locally and to deploy it to GCP!

Conclusions

Serverless is a concept definitely worth studying and using.
But not all of the tooling around it is perfect, or even simply good. Sometimes you are left all alone with lots of manual work. This article described a possible way to make using GCP Cloud Functions easier.
Was it a successful endeavour? I hope it was, but depending on the circumstances the answer is: it might be.
What is definitely true is that you can develop services as usual and extract separate views from them automatically.
Deploys of those views can be done automatically via source repo triggers. And an application for local development can be built automatically as well. Feels like good stuff.
But there are several questions still remaining open.
First of all, using two branches or two repos for the same code doesn’t feel healthy or particularly convenient.
Second, there’s no way to deploy only the changed functions: a push of the sources triggers redeployment of all declared functions.
Next, the gap between local and production environments can turn out to be much bigger than just a difference in a function’s arguments.
Finally, it’s up to you to put many pieces together, including running the functions as views of a local app.
Let’s hope the toolset around serverless will become more mature in the future.