"# Example of using elementwise activation functions in the CUTLASS Python interface\n",
"This notebook walks through a basic example of using the CUTLASS Python interface to declare, compile, and run GEMMs with different epilogues.\n",
"\n",
"[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/NVIDIA/cutlass/tree/master/examples/00_basic_gemm.ipynb)"
]
},
{
"cell_type": "markdown",
"id": "3ca993fe",
"metadata": {},
"source": [
"We first import the packages needed for this example and construct the input and output tensors."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "63a70a3c",
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"import cutlass\n",
"\n",
"# This controls whether the C++ GEMM declaration will be printed at each step. Set to `False` to\n",
"# omit this information.\n",
"print_module = True"
"## Run a GEMM with an identity activation function\n",
"To begin, we simply run a default GEMM with an identity activation function. This performs the well-known operation `D = alpha * (A @ B) + beta * C`. The identity activation is the default, so it does not need to be specified."
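As a reference for what this default epilogue computes, here is a NumPy sketch of `D = alpha * (A @ B) + beta * C`. The shapes and the `alpha`/`beta` values below are illustrative choices, not taken from the original example:

```python
import numpy as np

# Illustrative problem size and epilogue scalars (assumptions, not from the notebook).
m, n, k = 128, 128, 64
alpha, beta = 1.0, 0.0

A = np.random.rand(m, k).astype(np.float32)
B = np.random.rand(k, n).astype(np.float32)
C = np.random.rand(m, n).astype(np.float32)

# The default GEMM epilogue: a linear combination with no activation applied.
D = alpha * (A @ B) + beta * C
```

The CUTLASS GEMM produces the same result (up to floating-point accumulation differences), computed on the GPU.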
"## Run a GEMM with a ReLU element-wise activation function\n",
"CUTLASS makes it easy to support other element-wise activation functions. This amounts to performing an element-wise operation after the generic linear combination performed in a GEMM. If we call such an activation function `act`, the resulting formulation is:\n",
"```\n",
"D = alpha * (A @ B) + beta * C\n",
"D = act(D)\n",
"```\n",
"\n",
"Here, we will add a ReLU activation function. Given an input `x`, ReLU returns `max(x, 0)`.\n",
"\n",
"This is easy to do in CUTLASS. One only needs to set the plan's `activation` field."
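As a sanity check of what the ReLU epilogue computes, here is a NumPy reference of the two-step formulation above. The shapes and scalars are illustrative assumptions:

```python
import numpy as np

# Illustrative shapes and epilogue scalars (assumptions, not from the notebook).
alpha, beta = 1.0, 1.0
A = np.random.randn(64, 32).astype(np.float32)
B = np.random.randn(32, 64).astype(np.float32)
C = np.random.randn(64, 64).astype(np.float32)

# Step 1: the generic linear combination.
accum = alpha * (A @ B) + beta * C
# Step 2: the element-wise activation, here ReLU: max(x, 0).
D = np.maximum(accum, 0)
```

In the CUTLASS Python interface, the equivalent is setting the plan's `activation` field (e.g. `plan.activation = "relu"`) before launching the GEMM; the activation then runs as part of the fused epilogue rather than as a separate pass.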
"CUTLASS supports a variety of widely-used element-wise activation functions. We can obtain a list of these functions via the `get_activations()` method."
"To use an activation function that takes a parameter, such as `leaky_relu`, provide a tuple containing the activation function name and the parameter (or a list of parameters)."
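For the tuple form described above, a hedged sketch of the assignment might look like `plan.activation = ("leaky_relu", 0.5)`, where `0.5` is an illustrative slope, not a value from the original example. As a NumPy reference of what leaky ReLU itself computes:

```python
import numpy as np

# Leaky ReLU: act(x) = x if x > 0 else negative_slope * x.
# The slope value below is an illustrative assumption.
negative_slope = 0.5

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0], dtype=np.float32)
y = np.where(x > 0, x, negative_slope * x)
```

Unlike plain ReLU, negative inputs are scaled rather than zeroed, which is why the function requires the extra parameter.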