
April 09, 2025

Running Serverless Applications in Kubernetes with Knative


 

Kubernetes brings real flexibility and power to deploying and scaling serverless applications. This powerful container orchestration platform gives serverless programs a solid foundation, and Knative makes the integration straightforward. Knative with Kubernetes provides scalable, event-driven serverless computing without giving up flexibility.

 

What is Knative? 

Knative is an open-source platform for running serverless applications on Kubernetes. Its primary components are Knative Serving and Knative Eventing: Knative Serving runs scalable, HTTP-driven serverless services, while Knative Eventing routes and delivers events (such as CloudEvents) to your workloads.

Knative makes managing serverless apps on Kubernetes easier by letting you build, deploy, and scale applications without managing the underlying infrastructure. Whether you use Knative to build microservices, APIs, or event-driven apps, you can focus on the code instead of the servers.

 

Benefits of Using Knative with Kubernetes 

When combined with Kubernetes, Knative becomes a serverless computing powerhouse. Automatic scaling is a major benefit: Knative scales serverless services up or down according to demand, down to zero while idle, which makes it well suited to unpredictable traffic patterns. This reduces costs because you only pay for the resources your functions actually use.

Knative also increases Kubernetes' flexibility. You can run both HTTP-based and event-driven workloads, such as reacting to data changes or external triggers. Because Knative integrates directly with your Kubernetes environment, you can use its powerful ecosystem and features without changing your workflow.

 

Setting Up Knative on Kubernetes 

To use Knative, you need a Kubernetes cluster and kubectl installed on your local machine. If you don't have a cluster yet, you can use a managed service such as GKE or AKS, or run a local cluster with Minikube.
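If you choose Minikube, for example, starting a local cluster is typically a single command (the resource flags below are optional and only illustrative):

minikube start --cpus=4 --memory=8192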

Install Knative once your cluster is ready. Installing the Knative components with kubectl is straightforward. The Knative Serving release ships as YAML manifests: one for the custom resource definitions (CRDs) and one for the core components that serve HTTP-based workloads. Apply both:

kubectl apply -f https://github.com/knative/serving/releases/download/knative-v1.9.0/serving-crds.yaml
kubectl apply -f https://github.com/knative/serving/releases/download/knative-v1.9.0/serving-core.yaml
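Knative Serving also needs a networking layer to route traffic to your services. Kourier is a lightweight option maintained by the Knative project; assuming you want to use it, it can be installed and set as the default ingress like this (the exact release URL may vary with your Knative version):

kubectl apply -f https://github.com/knative/net-kourier/releases/download/knative-v1.9.0/kourier.yaml
kubectl patch configmap/config-network --namespace knative-serving --type merge --patch '{"data":{"ingress-class":"kourier.ingress.networking.knative.dev"}}'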

 

Next, if you want to handle event-driven workloads, install Knative Eventing the same way, CRDs first and then the core components:

kubectl apply -f https://github.com/knative/eventing/releases/download/knative-v1.9.0/eventing-crds.yaml
kubectl apply -f https://github.com/knative/eventing/releases/download/knative-v1.9.0/eventing-core.yaml

 

After installation, verify that Knative is running properly with the following command:

kubectl get pods --namespace knative-serving
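If you installed Knative Eventing as well, you can check its pods the same way (they live in the knative-eventing namespace by default):

kubectl get pods --namespace knative-eventing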

 

These commands display the status of the deployed components. Once everything shows Running, you can start deploying serverless functions.

 

Deploying a Serverless Function with Knative 

With Knative set up, let's launch a basic serverless function: a "Hello World" application exposed over HTTP.

First, write a basic function in Python, Node.js, or Go. For this walkthrough, assume a Python function that returns "Hello World" when called via its URL.
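As a rough sketch, a minimal Python version of such a function could be a small Flask app like the one below. It listens on the PORT environment variable that Knative injects into the container (defaulting to 8080); your own function may of course look different:

import os
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    # Respond to every request with a plain "Hello World"
    return "Hello World"

if __name__ == "__main__":
    # Knative Serving sets the PORT environment variable for the container
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))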

Next, describe the function's deployment in a Knative Service YAML file. This file contains critical information such as the service name, the container image, and how the service is exposed. A simple example:

apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello-world
spec:
  template:
    spec:
      containers:
        - image: gcr.io/knative-samples/helloworld-python
          name: hello-world

Deploy this function to Kubernetes with:

kubectl apply -f hello-world.yaml

Once deployed, use this command to get your function's URL:

kubectl get ksvc hello-world

 

This exposes your serverless function via a URL. The URL accepts HTTP requests, and Knative scales the function as required.
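For example, assuming the command above reports a URL such as http://hello-world.default.example.com, you can invoke the function with a plain HTTP request:

curl http://hello-world.default.example.com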

 

Scaling and Managing Serverless Applications with Knative 

One of Knative's best features is automatic scaling. When application traffic drops, Knative can scale your function down to zero, saving you money on idle resources. When traffic returns, Knative scales instances back up to handle it, keeping your application responsive under heavy load.
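If you want to tune this behavior, Knative exposes autoscaling settings as annotations on the revision template. As a sketch, the example below keeps at least one instance warm, caps scale-out at ten instances, and targets roughly 50 concurrent requests per instance; the values are only illustrative and should be adjusted to your workload:

apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello-world
spec:
  template:
    metadata:
      annotations:
        autoscaling.knative.dev/min-scale: "1"
        autoscaling.knative.dev/max-scale: "10"
        autoscaling.knative.dev/target: "50"
    spec:
      containers:
        - image: gcr.io/knative-samples/helloworld-python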

You can monitor and manage serverless apps with kubectl. To scale a service manually or change its settings, edit its YAML file and reapply it with kubectl apply.
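For day-to-day inspection, the standard kubectl verbs work against Knative's resources as well, for example:

kubectl get ksvc
kubectl get revisions
kubectl describe ksvc hello-world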

 

Conclusion 

Knative lets Kubernetes developers create and scale event-driven applications using serverless computing. By integrating serverless applications into Kubernetes, you get the breadth of the Kubernetes ecosystem along with the efficiency and flexibility of serverless computing. Because Knative manages the infrastructure behind your APIs, microservices, and event-driven systems, you can concentrate on writing code. Consider using Knative in your next Kubernetes project to simplify your serverless infrastructure.

