Error while using confluent-kafka python library with AWS lambda

I am trying to use the confluent-kafka Python library to administer my cluster via a Lambda function, but the function fails with the error:

"Unable to import module 'Test': No module named 'confluent_kafka.cimpl'"

My requirements.txt

requests
confluent-kafka

To create the zip file, I moved my code into the site-packages directory of the virtual env and zipped everything.

Python Code:

import confluent_kafka.admin
import requests
def lambda_handler(event, context):
    print("Hello World")

I am using macOS 10.X. On Linux, I noticed that pip install creates a separate confluent_kafka.libs directory, which does not get created on macOS.
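For reference, a sketch of one way to pull the Linux (manylinux) wheels from macOS with pip's cross-platform options; the platform tag, Python version, and the Test.py handler file name are assumptions here, not something verified in this question:

# Sketch: fetch Linux-compatible (manylinux) wheels on macOS so the zip
# contains the compiled confluent_kafka.cimpl extension and confluent_kafka.libs.
pip install \
    --platform manylinux2014_x86_64 \
    --implementation cp \
    --python-version 3.8 \
    --only-binary=:all: \
    --target ./package \
    -r requirements.txt

# Package the dependencies plus the handler module (assumed to be Test.py).
cd package && zip -r ../function.zip . && cd ..
zip -g function.zip Test.py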

Administration asked 5/9, 2020 at 18:52

I created the required layer and can verify that it works.

The technique uses the Docker-based build approach described in a recent AWS blog post.

For this question, I verified it as follows:

  1. Create an empty folder, e.g. mylayer.

  2. Go to the folder and create a requirements.txt file:

echo requests > requirements.txt
echo confluent-kafka >> requirements.txt

  3. Run the following docker command:

docker run -v "$PWD":/var/task "lambci/lambda:build-python3.8" /bin/sh -c "pip install -r requirements.txt -t python/lib/python3.8/site-packages/; exit"

  4. Create the layer as a zip:

zip -r mylayer.zip python > /dev/null

  5. Create a lambda layer based on mylayer.zip in the AWS Console. Don't forget to specify Compatible runtimes as python3.8.

  6. Test the layer in lambda using the following lambda function:

import confluent_kafka.admin
import requests

def lambda_handler(event, context):
    print(dir(confluent_kafka.admin))
    print(dir(requests))
    print("Hello World")

The function executes correctly:

['AdminClient', 'BrokerMetadata', 'CONFIG_SOURCE_DEFAULT_CONFIG', 'CONFIG_SOURCE_DYNAMIC_BROKER_CONFIG', 'CONFIG_SOURCE_DYNAMIC_DEFAULT_BROKER_CONFIG', 'CONFIG_SOURCE_DYNAMIC_TOPIC_CONFIG', 'CONFIG_SOURCE_STATIC_BROKER_CONFIG', 'CONFIG_SOURCE_UNKNOWN_CONFIG', 'ClusterMetadata', 'ConfigEntry', 'ConfigResource', 'ConfigSource', 'Enum', 'KafkaException', 'NewPartitions', 'NewTopic', 'PartitionMetadata', 'RESOURCE_ANY', 'RESOURCE_BROKER', 'RESOURCE_GROUP', 'RESOURCE_TOPIC', 'RESOURCE_UNKNOWN', 'TopicMetadata', '_AdminClientImpl', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', 'concurrent', 'functools']
['ConnectTimeout', 'ConnectionError', 'DependencyWarning', 'FileModeWarning', 'HTTPError', 'NullHandler', 'PreparedRequest', 'ReadTimeout', 'Request', 'RequestException', 'RequestsDependencyWarning', 'Response', 'Session', 'Timeout', 'TooManyRedirects', 'URLRequired', '__author__', '__author_email__', '__build__', '__builtins__', '__cached__', '__cake__', '__copyright__', '__description__', '__doc__', '__file__', '__license__', '__loader__', '__name__', '__package__', '__path__', '__spec__', '__title__', '__url__', '__version__', '_check_cryptography', '_internal_utils', 'adapters', 'api', 'auth', 'certs', 'chardet', 'check_compatibility', 'codes', 'compat', 'cookies', 'delete', 'exceptions', 'get', 'head', 'hooks', 'logging', 'models', 'options', 'packages', 'patch', 'post', 'put', 'request', 'session', 'sessions', 'ssl', 'status_codes', 'structures', 'urllib3', 'utils', 'warnings']
Hello World
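For completeness, the same layer can also be published and attached from the AWS CLI instead of the console. This is only a sketch; the layer name, function name, account id, and region below are placeholders:

# Publish the zip as a layer version compatible with python3.8.
aws lambda publish-layer-version \
    --layer-name confluent-kafka-python38 \
    --zip-file fileb://mylayer.zip \
    --compatible-runtimes python3.8

# Attach the published layer version to the function.
aws lambda update-function-configuration \
    --function-name my-kafka-admin-fn \
    --layers arn:aws:lambda:us-east-1:123456789012:layer:confluent-kafka-python38:1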
Tilbury answered 6/9, 2020 at 1:16
It seems there is a difference in how the wheel installation works for confluent-kafka on macOS vs Linux. As I mentioned in the question, on Linux pip install creates a separate confluent_kafka.libs, and because Lambda runs on Amazon Linux it expects that separate library. Since I was creating the zip on macOS, it didn't have that. I was able to run the code when I used the virtual env from a Linux box. - Administration
I have tried this; pip installed confluent-kafka using platform linux/amd64, but it still gives me an import error. Can you provide a little more guidance? - Grieg

I hit this even after setting up a layer properly. I was able to work around the problem by setting both my layer and my function to use the x86_64 architecture.
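As an illustration, pinning both the layer and the function to x86_64 could look like this with the AWS CLI; the layer name, function name, role ARN, and the Test.lambda_handler handler are placeholders, not values from this question:

# Publish the layer for the x86_64 architecture only.
aws lambda publish-layer-version \
    --layer-name confluent-kafka-python38 \
    --zip-file fileb://mylayer.zip \
    --compatible-runtimes python3.8 \
    --compatible-architectures x86_64

# Create the function on the same architecture so the compiled extension matches.
aws lambda create-function \
    --function-name my-kafka-admin-fn \
    --runtime python3.8 \
    --architectures x86_64 \
    --handler Test.lambda_handler \
    --role arn:aws:iam::123456789012:role/my-lambda-role \
    --layers arn:aws:lambda:us-east-1:123456789012:layer:confluent-kafka-python38:1 \
    --zip-file fileb://function.zip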

Tareyn answered 16/5, 2022 at 19:50
