Google Cloud Run is a fully managed serverless platform for deploying and scaling container-based applications. Datadog provides monitoring and log collection for Cloud Run functions (formerly Cloud Functions) through the Google Cloud integration.
Datadog Serverless Monitoring is supported for Cloud Run functions (2nd gen). If you want to monitor 1st gen functions, contact your technical account manager.
The profiler is shipped within Datadog tracing libraries. If you are already using APM to collect traces for your application, you can skip installing the library and proceed to enabling the profiler. See Enabling the Node.js Profiler to add the environment variables.
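For example, a minimal sketch of enabling the profiler where the tracer is initialized; setting DD_PROFILING_ENABLED=true on the main container is the equivalent environment-variable approach, and the service/env/version values below are placeholders:

// Minimal sketch: enable the Node.js profiler at tracer initialization.
// Equivalent to setting DD_PROFILING_ENABLED=true on the main container.
const tracer = require('dd-trace').init({
  profiling: true,                      // enable the continuous profiler
  service: 'my-cloud-run-function',     // placeholder values; DD_SERVICE/DD_ENV/DD_VERSION also work
  env: 'prod',
  version: '1.0.0',
});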
The Datadog sidecar collects logs through a shared volume. To forward logs from your main container to the sidecar, configure your application to write all logs to a location such as shared-volume/logs/*.log using the steps below. In the GCP UI, you must add the DD_SERVERLESS_LOG_PATH environment variable and a shared volume mount to both the main and sidecar containers.
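For example, if your application writes its logs under /shared-volume/logs/ as the code samples later on this page do, the sidecar's log path variable points at that pattern (the exact path depends on the volume mount you configure):

DD_SERVERLESS_LOG_PATH=/shared-volume/logs/*.log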
The profiler is shipped within Datadog tracing libraries. If you are already using APM to collect traces for your application, you can skip installing the library and proceed to enabling the profiler. See Enabling the Python Profiler to add the environment variables.
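As a minimal sketch, the profiler can also be started from code at import time; setting DD_PROFILING_ENABLED=true on the main container is the equivalent environment-variable approach:

# Minimal sketch: start the Python profiler alongside tracing.
# Equivalent to setting DD_PROFILING_ENABLED=true on the main container.
import ddtrace.profiling.auto  # noqa: F401 - importing this module starts the profiler
import ddtrace

ddtrace.patch(logging=True)  # keep trace/log correlation, as in the full example below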
The Datadog sidecar collects logs through a shared volume. To forward logs from your main container to the sidecar, configure your application to write all logs to a location such as shared-volume/logs/*.log using the steps below. In the GCP UI, you must add the DD_SERVERLESS_LOG_PATH environment variable and a shared volume mount to both the main and sidecar containers.
Add functions-framework-api and other dependencies like java-dogstatsd-client to your pom.xml.
Example pom.xml:
<?xml version="1.0" encoding="UTF-8"?><projectxmlns="http://maven.apache.org/POM/4.0.0"xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"><modelVersion>4.0.0</modelVersion><groupId>functions</groupId><artifactId>functions-hello-world</artifactId><version>1.0.0-SNAPSHOT</version><dependencyManagement><dependencies><dependency><artifactId>libraries-bom</artifactId><groupId>com.google.cloud</groupId><scope>import</scope><type>pom</type><version>26.32.0</version></dependency></dependencies></dependencyManagement><properties><maven.compiler.target>17</maven.compiler.target><maven.compiler.source>17</maven.compiler.source></properties><dependencies><!-- Required for Function primitives --><dependency><groupId>com.google.cloud.functions</groupId><artifactId>functions-framework-api</artifactId><version>1.1.4</version></dependency><dependency><groupId>com.google.cloud.functions.invoker</groupId><artifactId>java-function-invoker</artifactId><version>1.4.0</version></dependency><dependency><groupId>com.datadoghq</groupId><artifactId>java-dogstatsd-client</artifactId><version>4.4.3</version></dependency><dependency><groupId>org.apache.logging.log4j</groupId><artifactId>log4j-api</artifactId><version>2.19.0</version></dependency><dependency><groupId>org.apache.logging.log4j</groupId><artifactId>log4j-core</artifactId><version>2.19.0</version></dependency></dependencies><build><plugins><plugin><!--
Google Cloud Functions Framework Maven plugin
This plugin allows you to run Cloud Functions Java code
locally. Use the following terminal command to run a
given function locally:
mvn function:run -Drun.functionTarget=your.package.yourFunction
--><groupId>com.google.cloud.functions</groupId><artifactId>function-maven-plugin</artifactId><version>0.11.0</version><configuration><functionTarget>functions.HelloWorld</functionTarget></configuration></plugin><plugin><groupId>org.apache.maven.plugins</groupId><artifactId>maven-shade-plugin</artifactId><version>3.2.4</version><executions><execution><phase>package</phase><goals><goal>shade</goal></goals></execution></executions></plugin></plugins></build></project>
Run mvn clean package to update the target directory with the new .jar used in your Dockerfile.
As an alternative to the provided Dockerfile, you can use Artifact Registry to store the images built from your function source code. You can use Google Cloud Build or Buildpacks to build and deploy your image. For example: gcloud builds submit --pack image=LOCATION-docker.pkg.dev/PROJECT_ID/REPO_NAME/IMAGE_NAME
Add dd-java-agent.jar and java-function-invoker.jar to your Dockerfile.
Cloud Run Function code runs with a classpath that includes the function code and its dependencies. The Maven plugin automatically determines the classpath based on the dependencies in pom.xml.
If invoking the Functions Framework directly with the Datadog Agent, update your Dockerfile ENTRYPOINT to include the --classpath and --target options, along with the Java agent flag -javaagent:dd-java-agent.jar:
Replace FUNCTION_JAR with the target JAR generated from the Maven build, including all dependencies.
Replace FUNCTION_TARGET with the function’s entry point (for example, gcfv2.HelloworldApplication).
Example Dockerfile:
# Download Datadog Java Agent
FROM maven:3.8.3-openjdk-17 AS build
# Set working directory
WORKDIR /
# Download the required Maven dependency
RUN mvn dependency:get -Dartifact=com.google.cloud.functions.invoker:java-function-invoker:1.4.0 \
    && mvn dependency:copy -Dartifact=com.google.cloud.functions.invoker:java-function-invoker:1.4.0 -DoutputDirectory=/

FROM openjdk:17-jdk
# Set the working directory in the container
WORKDIR /
ADD 'https://dtdg.co/latest-java-tracer' dd-java-agent.jar
COPY --from=build java-function-invoker-1.4.0.jar java-function-invoker.jar
# Copy the JAR file into the container
COPY target/functions-hello-world-1.0.0-SNAPSHOT.jar helloworld.jar
ENV JAVA_OPTS=-javaagent:dd-java-agent.jar
# Expose the port (Cloud Run automatically assigns the actual port via $PORT)
ENV PORT=8080
EXPOSE 8080 8125/udp
ENTRYPOINT ["java", "-javaagent:/dd-java-agent.jar", "-jar", "/java-function-invoker.jar", "--classpath", "/helloworld.jar", "--target", "functions.HelloWorld"]
To deploy the Java function, run the following command from the top-level directory containing your pom.xml and Dockerfile:
gcloud beta run deploy FUNCTION_NAME \
--source . \
--function FUNCTION_TARGET \
--clear-base-image \
--region REGION
Replace REGION with the region where you want to deploy the function.
Replace FUNCTION_TARGET with your function entry point. For example, gcfv2.HelloworldApplication.
Replace FUNCTION_NAME with the name of your Cloud Run function.
Ensure that you set --clear-base-image to deploy your Cloud Run function with the Dockerfile.
When setting up your containers, use the same container image deployed in the previous steps.
The profiler is shipped within Datadog tracing libraries. If you are already using APM to collect traces for your application, you can skip installing the library and proceed to enabling the profiler. See Enabling the Java Profiler to add the environment variables.
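For example, with the Dockerfile shown earlier, one way to enable the profiler is to pass the dd.profiling.enabled system property next to the -javaagent flag; setting the DD_PROFILING_ENABLED=true environment variable on the main container is equivalent. This sketch assumes the same JAR names used above:

# Sketch only: same ENTRYPOINT as above, with the profiler flag added.
ENTRYPOINT ["java", "-javaagent:/dd-java-agent.jar", "-Ddd.profiling.enabled=true", "-jar", "/java-function-invoker.jar", "--classpath", "/helloworld.jar", "--target", "functions.HelloWorld"]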
The Datadog sidecar collects logs through a shared volume. To forward logs from your main container to the sidecar, configure your application to write all logs to a location such as shared-volume/logs/*.log using the steps below. In the GCP UI, you must add the DD_SERVERLESS_LOG_PATH environment variable and a shared volume mount to both the main and sidecar containers.
The profiler is shipped within Datadog tracing libraries. If you are already using APM to collect traces for your application, you can skip installing the library and proceed to enabling the profiler. See Enabling the Go Profiler to add the environment variables.
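As a sketch, assuming your service already calls tracer.Start() in init() as in the full example later on this page, the profiler can be started explicitly from the same package (startProfiler is a hypothetical helper, and the profile types shown are illustrative):

package helloworld

import (
    "gopkg.in/DataDog/dd-trace-go.v1/profiler"
)

// startProfiler is a hypothetical helper showing how the Go profiler could be
// started next to tracer.Start() in init().
func startProfiler() {
    if err := profiler.Start(
        profiler.WithProfileTypes(profiler.CPUProfile, profiler.HeapProfile),
    ); err != nil {
        panic(err)
    }
}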
The Datadog sidecar collects logs through a shared volume. To forward logs from your main container to the sidecar, configure your application to write all logs to a location such as shared-volume/logs/*.log using the steps below. In the GCP UI, you must add the DD_SERVERLESS_LOG_PATH environment variable and a shared volume mount to both the main and sidecar containers.
The profiler is shipped within Datadog tracing libraries. If you are already using APM to collect traces for your application, you can skip installing the library and proceed to enabling the profiler. See Enabling the .NET Profiler to add the environment variables.
The Datadog sidecar collects logs through a shared volume. To forward logs from your main container to the sidecar, configure your application to write all logs to a location such as shared-volume/logs/*.log using the steps below. In the GCP UI, you must add the DD_SERVERLESS_LOG_PATH environment variable and a shared volume mount to both the main and sidecar containers.
To set up logging in your application, see C# Log Collection. To set up trace log correlation, see Correlating .NET Logs and Traces.
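As an illustrative sketch of the correlation step, the active trace and span IDs can be pushed into the Serilog context before writing a log line. LogCorrelationExample below is a hypothetical helper, and it assumes the Serilog logger is configured with .Enrich.FromLogContext():

// Illustrative sketch: push the active Datadog trace IDs into the Serilog
// LogContext so file logs collected by the sidecar can be correlated with traces.
// Assumes Log.Logger was created with .Enrich.FromLogContext().
using Datadog.Trace;
using Serilog;
using Serilog.Context;

public static class LogCorrelationExample   // hypothetical helper class
{
    public static void LogWithTraceContext(string message)
    {
        using (LogContext.PushProperty("dd.trace_id", CorrelationIdentifier.TraceId.ToString()))
        using (LogContext.PushProperty("dd.span_id", CorrelationIdentifier.SpanId.ToString()))
        {
            Log.Information(message);
        }
    }
}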
If you are deploying a new Cloud Run function for the first time through the console, wait for Cloud Run to create the service and update the placeholder revision image. Then, follow the steps below to add the sidecar container, shared volume mount, startup check, and environment variables.
Go to Volume Mounts and add the same shared volume as you did for the sidecar container.
Note: Save your changes by selecting Done. Do not deploy changes until the final step.
Go to Variables & Secrets and add the same DD_SERVICE environment variable that you set for the sidecar container.
Go to Settings. In the Container start up order drop-down menu, select your sidecar.
Do not use the DD_LOGS_ENABLED environment variable. This variable is only used for the serverless-init install method.
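The same sidecar configuration can also be expressed as a Cloud Run service YAML instead of through the console. The following is an illustrative sketch only: the service and container names are placeholders, and you should adjust the image references, API key handling, and probe port to match your deployment.

apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: my-function                    # placeholder service name
spec:
  template:
    metadata:
      annotations:
        # Startup ordering: the main container waits for the Datadog sidecar.
        run.googleapis.com/container-dependencies: '{"main-container": ["datadog-sidecar"]}'
    spec:
      containers:
        - name: main-container
          image: LOCATION-docker.pkg.dev/PROJECT_ID/REPO_NAME/IMAGE_NAME
          env:
            - name: DD_SERVICE
              value: my-function       # must match the sidecar's DD_SERVICE
          volumeMounts:
            - name: shared-volume
              mountPath: /shared-volume
        - name: datadog-sidecar
          image: gcr.io/datadoghq/serverless-init:latest
          env:
            - name: DD_SERVICE
              value: my-function
            - name: DD_SERVERLESS_LOG_PATH
              value: /shared-volume/logs/*.log
            - name: DD_API_KEY
              value: YOUR_DATADOG_API_KEY   # reference a secret in practice
          volumeMounts:
            - name: shared-volume
              mountPath: /shared-volume
          startupProbe:
            tcpSocket:
              port: 8126               # assumption: probe the sidecar's trace port
      volumes:
        - name: shared-volume
          emptyDir:
            medium: Memory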
You can also find FUNCTION_TARGET on the Source tab in the Google Cloud console, listed as the Function entry point.
// This line must come before importing the logger.
const tracer = require('dd-trace').init({ logInjection: true });
const functions = require('@google-cloud/functions-framework');
const { createLogger, format, transports } = require('winston');
const fs = require('fs');

// Create a directory
const directoryPath = '/shared-volume/logs';
fs.mkdir(directoryPath, { recursive: true }, (err) => {
  if (err) {
    console.error(err);
    return;
  }
  console.log('Directory created successfully!');
});

// Create a file inside the directory
const filePath = directoryPath + '/index.log';
console.log('Directory created successfully!' + filePath);

const logger = createLogger({
  level: 'info',
  exitOnError: false,
  format: format.json(),
  transports: [new transports.File({ filename: filePath })],
});

function handler(req, res) {
  logger.log('info', 'Hello simple log!');
  tracer.dogstatsd.increment('ninja.run.func.sent', 1, { environment: 'test', runtime: 'nodejs' });
  return res.send('Welcome to Datadog 💜!');
}

const handlerWithTrace = tracer.wrap('example-span', handler);

functions.http('httpexample', handlerWithTrace);

module.exports = handlerWithTrace;
module.exports = logger;
{"name":"updater","version":"1.0.0","description":"test nodejs run function","main":"index.js","scripts":{"test":"echo \"Error: no test specified\" && exit 1"},"keywords":[],"author":"","license":"ISC","dependencies":{"@google-cloud/functions-framework":"^3.4.2","dd-trace":"^5.19.0","winston":"^3.13.1","express":"^4.17.1"}}
import functions_framework
import ddtrace
import logging
from datadog import initialize, statsd
import os

ddtrace.patch(logging=True)

file_path = "/shared-volume/logs/app.log"  # This is the path to the shared volume
os.makedirs(os.path.dirname(file_path), exist_ok=True)

FORMAT = ('%(asctime)s %(levelname)s [%(name)s] [%(filename)s:%(lineno)d] '
          '[dd.service=%(dd.service)s dd.env=%(dd.env)s dd.version=%(dd.version)s dd.trace_id=%(dd.trace_id)s dd.span_id=%(dd.span_id)s] '
          '- %(message)s')
logging.basicConfig(level=logging.DEBUG, filename=file_path, format=FORMAT, force=True)

ddlogs = []

initialize(**{'statsd_port': 8125})

@ddtrace.tracer.wrap()
@functions_framework.http
def hello_http(request):
    log = request.args.get("log")
    statsd.increment("ninja.run.func.sent", tags=["runtime:python"])
    if log != None:
        with ddtrace.tracer.trace('sending-test-logs') as span:
            span.set_tag('logs', 'TEST')
            logging.debug(log)
            ddlogs.append(log)
    return "Welcome to Datadog!💜"
package gcfv2;

import java.io.BufferedWriter;
import java.io.File;
import java.io.IOException;

import com.google.cloud.functions.HttpFunction;
import com.google.cloud.functions.HttpRequest;
import com.google.cloud.functions.HttpResponse;
import com.timgroup.statsd.NonBlockingStatsDClientBuilder;
import com.timgroup.statsd.StatsDClient;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class HelloworldApplication implements HttpFunction {
  private static final StatsDClient Statsd = new NonBlockingStatsDClientBuilder().hostname("localhost").build();
  protected static final Logger logger4 = LogManager.getLogger();

  public static void createLogFile() {
    File directory = new File("shared-volume/logs");
    if (!directory.exists()) {
      directory.mkdirs(); // Create directory if it doesn't exist
    } else {
      try {
        File logFile = new File("shared-volume/logs/app.log");
        if (!logFile.exists()) {
          logFile.createNewFile();
        }
      } catch (IOException e) {
        e.printStackTrace();
      }
    }
  }

  public void service(final HttpRequest request, final HttpResponse response) throws Exception {
    createLogFile();
    Statsd.incrementCounter("ninja.run.func.sent");
    final BufferedWriter writer = response.getWriter();
    logger4.info("Hello GCP!");
    writer.write("Hello Datadog!!");
  }
}
<?xml version="1.0" encoding="UTF-8"?><projectxmlns="http://maven.apache.org/POM/4.0.0"xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"><modelVersion>4.0.0</modelVersion><groupId>functions</groupId><artifactId>functions-hello-world</artifactId><version>1.0.0-SNAPSHOT</version><dependencyManagement><dependencies><dependency><artifactId>libraries-bom</artifactId><groupId>com.google.cloud</groupId><scope>import</scope><type>pom</type><version>26.32.0</version></dependency></dependencies></dependencyManagement><properties><maven.compiler.target>17</maven.compiler.target><maven.compiler.source>17</maven.compiler.source></properties><dependencies><!-- Required for Function primitives --><dependency><groupId>com.google.cloud.functions</groupId><artifactId>functions-framework-api</artifactId><version>1.1.4</version></dependency><dependency><groupId>com.google.cloud.functions.invoker</groupId><artifactId>java-function-invoker</artifactId><version>1.4.0</version></dependency><dependency><groupId>com.datadoghq</groupId><artifactId>java-dogstatsd-client</artifactId><version>4.4.3</version></dependency><dependency><groupId>org.apache.logging.log4j</groupId><artifactId>log4j-api</artifactId><version>2.19.0</version></dependency><dependency><groupId>org.apache.logging.log4j</groupId><artifactId>log4j-core</artifactId><version>2.19.0</version></dependency></dependencies><build><plugins><plugin><!--
Google Cloud Functions Framework Maven plugin
This plugin allows you to run Cloud Functions Java code
locally. Use the following terminal command to run a
given function locally:
mvn function:run -Drun.functionTarget=your.package.yourFunction
--><groupId>com.google.cloud.functions</groupId><artifactId>function-maven-plugin</artifactId><version>0.11.0</version><configuration><functionTarget>functions.HelloWorld</functionTarget></configuration></plugin><plugin><groupId>org.apache.maven.plugins</groupId><artifactId>maven-shade-plugin</artifactId><version>3.2.4</version><executions><execution><phase>package</phase><goals><goal>shade</goal></goals></execution></executions></plugin></plugins></build></project>
# Download Datadog Java Agent
FROM maven:3.8.3-openjdk-17 AS build
# Set working directory
WORKDIR /
# Download the required Maven dependency
RUN mvn dependency:get -Dartifact=com.google.cloud.functions.invoker:java-function-invoker:1.4.0 \
    && mvn dependency:copy -Dartifact=com.google.cloud.functions.invoker:java-function-invoker:1.4.0 -DoutputDirectory=/

FROM openjdk:17-jdk
# Set the working directory in the container
WORKDIR /
ADD 'https://dtdg.co/latest-java-tracer' dd-java-agent.jar
COPY --from=build java-function-invoker-1.4.0.jar java-function-invoker.jar
# Copy the JAR file into the container
COPY target/functions-hello-world-1.0.0-SNAPSHOT.jar helloworld.jar
ENV JAVA_OPTS=-javaagent:dd-java-agent.jar
# Expose the port (Cloud Run automatically assigns the actual port via $PORT)
ENV PORT=8080
EXPOSE 8080 8125/udp
ENTRYPOINT ["java", "-javaagent:/dd-java-agent.jar", "-jar", "/java-function-invoker.jar", "--classpath", "/helloworld.jar", "--target", "functions.HelloWorld"]
package helloworld

import (
    "fmt"

    "github.com/sirupsen/logrus"
    dd_logrus "gopkg.in/DataDog/dd-trace-go.v1/contrib/sirupsen/logrus"

    "html/template"
    "net/http"
    "os"
    "path/filepath"

    "github.com/DataDog/datadog-go/v5/statsd"
    "github.com/GoogleCloudPlatform/functions-framework-go/functions"
    "gopkg.in/DataDog/dd-trace-go.v1/ddtrace"
    "gopkg.in/DataDog/dd-trace-go.v1/ddtrace/tracer"
)

const logDir = "/shared-volume/logs"

var ddlogs []string
var logFile *os.File
var logCounter int
var dogstatsdClient *statsd.Client

const homeTemplate = `
<!DOCTYPE html>
<head>
<meta charset="UTF-8">
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Datadog Test</title>
</head>
<body>
<h1>Welcome to Datadog!💜</h1>
<form action="" method="get">
<input type="text" name="log" placeholder="Enter Log">
<button>Add Log</button>
</form>
<h3>Logs Sent to Datadog:</h3>
<ul>
{{range .}}
<li>{{.}}</li>
{{end}}
</ul>
</body>
`

func init() {
    logrus.SetFormatter(&logrus.JSONFormatter{})
    logrus.AddHook(&dd_logrus.DDContextLogHook{})

    err := os.MkdirAll(logDir, 0755)
    if err != nil {
        panic(err)
    }
    logFilePath := filepath.Join(logDir, "maincontainer.log")
    logrus.Println("Saving logs in ", logFilePath)
    logFileLocal, err := os.OpenFile(logFilePath, os.O_WRONLY|os.O_APPEND|os.O_CREATE, 0644)
    if err != nil {
        panic(err)
    }
    logrus.SetOutput(logFileLocal)
    logFile = logFileLocal
    logrus.Print("Main container started...")

    dogstatsdClient, err = statsd.New("127.0.0.1:8125")
    if err != nil {
        panic(err)
    }

    tracer.Start()
    functions.HTTP("HelloHTTP", helloHTTP)
}

// helloHTTP is an HTTP Cloud Function with a request parameter.
func helloHTTP(w http.ResponseWriter, r *http.Request) {
    span := tracer.StartSpan("maincontainer", tracer.ResourceName("/helloHTTP"))
    logrus.Printf("Yay!! Main container works %v", span)
    err := dogstatsdClient.Incr("ninja.run.func.sent", []string{"runtime:go"}, 1)
    if err != nil {
        logrus.Error("Error incrementing counter:", err)
    }
    defer span.Finish()

    sent_log := r.URL.Query().Get("log")
    if sent_log != "" {
        logCounter++
        writeLogsToFile(fmt.Sprintf("received request %d", logCounter), span.Context())
        writeLogsToFile(sent_log, span.Context())
        ddlogs = append(ddlogs, sent_log)
    }

    tmpl, err := template.New("home").Parse(homeTemplate)
    if err != nil {
        logrus.Error("Error parsing template:", err)
    }
    tmpl.Execute(w, ddlogs)
}

func writeLogsToFile(log_msg string, context ddtrace.SpanContext) {
    span := tracer.StartSpan("writeLogToFile", tracer.ResourceName("/writeLogsToFile"), tracer.ChildOf(context))
    defer span.Finish()
    _, err := logFile.WriteString(log_msg + "\n")
    if err != nil {
        logrus.Println("Error writing to log file:", err)
    }
}
using Google.Cloud.Functions.Framework;
using StatsdClient;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using System.IO;
using System.Text.Json;
using System.Threading.Tasks;
using Datadog.Trace;
using Serilog;
using Serilog.Formatting.Compact;
using Serilog.Sinks.File;

namespace HelloHttp;

public class Function : IHttpFunction
{
    public DogStatsdService _dsd;

    public Function()
    {
        var dogstatsdConfig = new StatsdConfig
        {
            StatsdServerName = "127.0.0.1",
            StatsdPort = 8125,
        };
        _dsd = new DogStatsdService();
        _dsd.Configure(dogstatsdConfig);

        string directoryPath = "/shared-volume/logs";
        string filePath = Path.Combine(directoryPath, "app.log");

        // Create the directory if it doesn't exist
        if (!Directory.Exists(directoryPath))
        {
            Directory.CreateDirectory(directoryPath);
        }

        // Create a file if it doesn't exist
        if (!File.Exists(filePath))
        {
            File.WriteAllText(filePath, "Hello, this is the content of the file.");
        }

        Log.Logger = new LoggerConfiguration()
            .WriteTo.File(new RenderedCompactJsonFormatter(), "/shared-volume/logs/app.log")
            .CreateLogger();
    }

    public async Task HandleAsync(HttpContext context)
    {
        using (var scope = Tracer.Instance.StartActive("test-function-dotnet"))
        {
            _dsd.Increment("ninja.run.func.sent", tags: new[] { "runtime:dotnet" });
            Log.Information("Hello Datadog Cloud Run Functions! 💜");
            await context.Response.WriteAsync("Hello Datadog Cloud Run Functions! 💜");
        }
    }
}