Firebase: Using the Python Admin SDK on Google Cloud Functions
The Google Cloud Functions (GCF) Python runtime was launched into beta at the recently concluded GCP Next. Ever since the announcement I’ve been looking forward to trying it out with the Firebase Admin SDK. I finally managed to spend some time on it last week.
As my first experiment with the GCF Python runtime, I wanted to convert one of my existing Python applications into a Cloud Function. A while ago I wrote an article about implementing a simple web service in Python using Flask and Firebase Admin SDK. In that article I deployed the resulting service using Google Compute Engine. In this post I describe how to port that same service over to GCF.
Setting up the development environment
You will need a billing-enabled Google Cloud Platform (GCP) project, and a local installation of the gcloud command-line tool. You will also require Python 3 and the virtualenv utility. Execute the following commands in a Linux/Unix shell to get the environment set up.
$ gcloud components update
$ gcloud components install beta
$ gcloud config set project <your-project-id>
$ virtualenv -p python3 env
$ source env/bin/activate
(env) $ pip install firebase-admin flask
These commands update the gcloud utility, and configure it to use your GCP project. Then we create a new virtualenv sandbox, and install the required Python dependencies.
You also need to set up Google Application Default Credentials in order to test the function locally. Download a service account JSON file for your GCP/Firebase project, and set the required environment variable.
(env) $ export GOOGLE_APPLICATION_CREDENTIALS=path/to/creds.json
Coding the function
Create a new directory for your code, and make it your working directory.
(env) $ mkdir heroes-function
(env) $ cd heroes-function
Then create a requirements.txt file, and specify the required dependencies for your function. We just need to declare one dependency for this example — firebase-admin. Listing 1 shows what this file should look like.
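In the simplest case the file contains just that one package name (the actual listing may pin a specific version, which is generally a good idea for reproducible deployments):

```
firebase-admin
```

GCF reads this file at deploy time and installs the listed packages into the function’s runtime environment.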
Next, create a file named main.py (the file name is important for GCF), and implement the web service as shown in listing 2.
Note that this is a variation of the same web service we developed in the past for the Compute Engine environment. It exposes the same endpoints, and performs the same operations.
The heroes() function at the bottom of the file is the entry point to our service. It gets invoked by the GCF runtime, and receives a flask.Request object as the sole argument. Depending on the HTTP method and URL path, we then dispatch the request to one of the four helper methods that perform Firebase Database operations.
Testing it locally
I didn’t find much documentation on testing Python Cloud Functions locally, so I devised my own mechanism. Create another file named main_emulate.py, and save it with the contents in listing 3.
Listing 3 fires up a local Flask server that accepts all HTTP requests addressed to the /heroes path, and delegates them to the heroes() function in main.py. You can simply execute main_emulate.py from the command-line to get the test server up and running.
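The emulator can be sketched along these lines. The route rules and the stub below are my assumptions; in the real file the heroes() function is imported from main.py rather than stubbed out. The key trick is rewriting request.path so heroes() sees the same path shape it would see on GCF.

```python
import flask

# Stand-in for `from main import heroes`; the stub keeps this sketch
# self-contained by simply echoing back the path it receives.
def heroes(request):
    return flask.jsonify(path=request.path)

app = flask.Flask(__name__)

@app.route('/heroes', defaults={'subpath': ''},
           methods=['GET', 'POST', 'PUT', 'DELETE'])
@app.route('/heroes/<path:subpath>',
           methods=['GET', 'POST', 'PUT', 'DELETE'])
def emulate(subpath):
    req = flask.request
    # GCF does not expose the /heroes prefix to the function, so rewrite
    # request.path to mimic what the real runtime passes in.
    req.path = '/' + subpath
    return heroes(req)

if __name__ == '__main__':
    app.run(port=5000)
```

This works because Werkzeug’s request path is a cached property, so assigning to it on the request object shadows the computed value.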
(env) $ python main_emulate.py
The server listens on port 5000. Send it a few requests to make sure everything works as expected.
$ curl -v -X POST -d '{"name":"Spider-Man"}' -H "Content-type: application/json" localhost:5000/heroes
< HTTP/1.0 201 CREATED
< Content-Type: application/json
< Content-Length: 30
< Server: Werkzeug/0.14.1 Python/3.6.1
< Date: Sun, 26 Aug 2018 19:38:44 GMT
<
{"id":"-LKrjiNJYewVUSHSMsHs"}

$ curl -v localhost:5000/heroes/-LKrjiNJYewVUSHSMsHs
< HTTP/1.0 200 OK
< Content-Type: application/json
< Content-Length: 22
< Server: Werkzeug/0.14.1 Python/3.6.1
< Date: Sun, 26 Aug 2018 19:40:09 GMT
<
{"name":"Spider-Man"}
Terminate the test server by hitting ctrl + c when you’re done.
Deploying to the cloud
Deploying to the cloud is just a matter of running the following command.
(env) $ gcloud beta functions deploy heroes --runtime python37 --trigger-http
No need to spin up any VMs or install any other software in the cloud; GCF takes care of all that. You have to admire the convenience of Functions-as-a-Service, don’t you?
The deployment may take a couple of minutes. When done, the console output shows the base URL of the newly deployed service. The base URL will have the name of our Python function — heroes — appended to it. Use curl to send a few requests and try it out. The first request may take a couple of seconds due to cold start, but subsequent requests shouldn’t take more than a couple of hundred milliseconds.
$ time curl -v -X POST -d '{"name": "Iron Man"}' -H "Content-type: application/json" https://us-central1-my-project-id.cloudfunctions.net/heroes
< HTTP/2 201
< content-type: application/json
< function-execution-id: cewm1zwhha87
< x-cloud-trace-context: b074b9245a6bbe45e0f0a51644901be7;o=1
< date: Sun, 26 Aug 2018 19:46:44 GMT
< server: Google Frontend
< content-length: 35
< alt-svc: quic=":443"; ma=2592000; v="44,43,39,35"
<
{
"id": "-LKrlYl3n3D7oc_22cY3"
}

real 0m3.050s
user 0m0.034s
sys 0m0.021s

$ time curl -v https://us-central1-my-project-id.cloudfunctions.net/heroes/-LKrlYl3n3D7oc_22cY3
< HTTP/2 200
< content-type: application/json
< function-execution-id: cewmln72omz6
< x-cloud-trace-context: a075099fb5e39d02babd238fa7269840;o=1
< date: Sun, 26 Aug 2018 19:47:09 GMT
< server: Google Frontend
< content-length: 25
< alt-svc: quic=":443"; ma=2592000; v="44,43,39,35"
<
{
"name": "Iron Man"
}

real 0m0.292s
user 0m0.033s
sys 0m0.021s
I’m calling my function over HTTPS in the above examples. Note that I’m also using the time command to time my command executions, and we can see how the latency drops from the first to the second request. Even though /heroes is part of the request URL, GCF does not expose it on the flask.Request object injected into main.py. When testing locally, we accounted for this behavior of GCF by rewriting the request.path attribute in main_emulate.py.
Caveats
Things are looking pretty impressive at this point. But I do have a couple of issues that I’d like to point out as well.
- Error reporting functions like flask.abort() do not currently work on GCF, despite being mentioned in the documentation. This made listing 2 longer and more repetitive than it had to be. I believe this is a bug, and hopefully the GCF team will fix it soon.
- I wish there was a better (and officially documented) approach for locally testing Cloud Functions written in Python. Firebase provides some amazing tools for testing and emulating Node.js Cloud Functions locally. I hope something similar becomes available for Python soon.
- I wish it was possible to use Flask route decorators in my code, instead of having to implement the routing logic myself. I’ve experimented with a few potential solutions, but they didn’t really work when deployed to the cloud. If somebody knows of a way to make this work, please let me know.
The Python runtime in GCF is still pretty new, and I’m sure it will get better with time. For now, I’ve reported my feedback to the GCP team.
Conclusion
All in all, my initial foray into GCF Python with the Firebase Admin SDK has been a success. I think it is a convenient, scalable and cost-effective way for developers to create Python web services that interact with Firebase. Personally, I’ve been waiting for Python support in GCF for a long time, and I’m so excited to see it finally out. Feel free to share your own experience with the new runtime, and let me know what type of things you would use it for.