Bots With Spin and Fermyon Cloud
Mikkel Mørk Hegnhøj
spin
cloud
wasm
webassembly
rust
bot
microservices
slack
We’ve been seeing hundreds of users trying out the Fermyon Cloud, and we’ve been busy playing around with a variety of use cases ourselves. This blog post showcases using Spin and Fermyon Cloud to host a Slack bot.
Bots - An Easy Use Case With Spin and the Fermyon Cloud
Since the release of Spin v0.6.0 and the Fermyon Cloud on October 24th, we’ve seen a lot of users trying out the Fermyon Cloud, and we’ve been busy playing around with a variety of use cases ourselves. Lately we’ve spent every Friday just building applications with Spin and the Fermyon Cloud - those Fridays are fun.
Why are we doing this? Some would say we are eating our own dog food; however, we consider neither Spin nor the open beta of Fermyon Cloud to be that bad - it’s more like a nice quick snack, which is healthy, nourishes you, and brightens your day, because it’s a lot of fun to develop with Spin. So while we wait for the full-course meal, we’ve cooked up a few fun use cases which we want to share with you all.
Bots
As a company that uses Slack heavily, what’s more obvious than populating our channels with a set of new colleagues in the form of Spin bots, powered by the Fermyon Cloud?
It all started with Fermybot3000, and things quickly escalated - we now have at least five bots roaming our Slack channels. You can get answers to anything from how many grapes we have (it’s a KubeCon thing…) to whether it’s Friday in the Fermyon Cloud. All good and useful features of a Slack channel 🤔
How Does a Spin Slack Bot Work?
The bot is hooked up to the slash command feature of Slack, so you just have to write /friday to figure out if it’s Friday in the Fermyon Cloud.
When you write the /friday command in a Slack channel, an HTTP request is sent to an endpoint you define. In this case, we’re pointing Slack to our Spin application running in the Fermyon Cloud.
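Under the hood, that endpoint is simply the HTTP route of a Spin component. As a rough sketch, a spin.toml manifest for such an application could look something like this (the name, source path, and route below are illustrative, not the exact manifest from our repository):

spin_version = "1"
name = "friday-bot"
version = "0.1.0"
trigger = { type = "http", base = "/" }

[[component]]
id = "friday"
# Path to the compiled Wasm module; adjust to your build output
source = "target/wasm32-wasi/release/friday_bot.wasm"
[component.trigger]
route = "/friday"

Once the application is deployed, the slash command’s request URL in Slack is pointed at that route on the application’s Fermyon Cloud URL, something like https://friday-bot.fermyon.app/friday.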
This particular bot is written in Rust, using the http-rust Spin template. The full handler function looks like this:
#[http_component]
fn friday(req: Request) -> Result<Response> {
    // Slack sends the slash command as a form-encoded body; deserialize it
    // into our SlackSlashCommand type.
    let slash_cmd: Option<SlackSlashCommand> = req
        .body()
        .as_deref()
        .map(serde_urlencoded::from_bytes)
        .transpose()?;
    let username = match &slash_cmd {
        Some(s) => s.user_name.clone(),
        _ => "".to_string(),
    };
    let now = Utc::now();
    println!("{now:?} : Command: {slash_cmd:?}");
    // Craft the response depending on whether it's Friday (UTC).
    let resp = match now.weekday() {
        Weekday::Fri => SlackSlashResponse {
            response_type: ResponseType::InChannel,
            text: format!("Hi {username}, It's Friday in the Fermyon Cloud 🥳").to_string(),
        },
        _ => SlackSlashResponse {
            response_type: ResponseType::InChannel,
            text: format!("Hi {username}, It's not Friday in the Fermyon Cloud 🫤").to_string(),
        },
    };
    // Reply to Slack with the JSON-encoded response.
    let resp_bytes = serde_json::to_vec(&resp)?;
    println!("Response: {}", String::from_utf8_lossy(&resp_bytes));
    Ok(http::Response::builder()
        .status(200)
        .header("Content-Type", "application/json")
        .body(Some(resp_bytes.into()))?)
}
Let’s dissect the code a little bit. We start by deserializing the incoming request data into a custom type SlackSlashCommand we’ve defined (a struct of string fields), and extracting the username of the person who wrote the command in Slack.
let slash_cmd: Option<SlackSlashCommand> = req
    .body()
    .as_deref()
    .map(serde_urlencoded::from_bytes)
    .transpose()?;
let username = match &slash_cmd {
    Some(s) => s.user_name.clone(),
    _ => "".to_string(),
};
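The SlackSlashCommand type isn’t shown above. As a minimal sketch (the actual definition lives in the repository), it’s a serde-deserializable struct mirroring a few of the form fields Slack sends along with a slash command:

use serde::Deserialize;

// Sketch of the slash command payload type. Slack posts the command as
// application/x-www-form-urlencoded key/value pairs, and serde simply
// ignores any fields we don't declare here.
#[derive(Debug, Deserialize)]
struct SlackSlashCommand {
    user_name: String, // the Slack user who typed the command
    command: String,   // e.g. "/friday"
    text: String,      // any text typed after the command
}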
We then figure out if it’s a Friday or not, and craft the appropriate response.
let now = Utc::now();
println!("{now:?} : Command: {slash_cmd:?}");
let resp = match now.weekday() {
    Weekday::Fri => SlackSlashResponse {
        response_type: ResponseType::InChannel,
        text: format!("Hi {username}, It's Friday in the Fermyon Cloud 🥳").to_string(),
    },
    _ => SlackSlashResponse {
        response_type: ResponseType::InChannel,
        text: format!("Hi {username}, It's not Friday in the Fermyon Cloud 🫤").to_string(),
    },
};
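SlackSlashResponse and ResponseType aren’t shown above either; a sketch along these lines (again, the exact definitions are in the repository) matches Slack’s slash command response format:

use serde::Serialize;

// Sketch of the response types. Slack expects a JSON body with a
// "response_type" field ("in_channel" or "ephemeral") and a "text" field.
#[derive(Debug, Serialize)]
struct SlackSlashResponse {
    response_type: ResponseType,
    text: String,
}

#[derive(Debug, Serialize)]
#[serde(rename_all = "snake_case")]
enum ResponseType {
    InChannel, // visible to everyone in the channel
    Ephemeral, // visible only to the user who issued the command
}

Using ResponseType::InChannel means the answer is posted for the whole channel to see, not just for the person who typed /friday.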
Since we’re a company working across 5 continents and 9 time zones, it’s almost always Friday somewhere, so we just rely on whether it’s Friday in the cloud - which, as far as the code is concerned, means UTC.
Finally, the response is serialized to JSON, put into the response body, and sent back to Slack.
let resp_bytes = serde_json::to_vec(&resp)?;
println!("Response: {}", String::from_utf8_lossy(&resp_bytes));
Ok(http::Response::builder()
    .status(200)
    .header("Content-Type", "application/json")
    .body(Some(resp_bytes.into()))?)
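On a Friday, the JSON body we hand back to Slack then looks roughly like this (with the username taken from the slash command payload):

{
  "response_type": "in_channel",
  "text": "Hi some_user, It's Friday in the Fermyon Cloud 🥳"
}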
You can find the code in this repository.
Why Are Bots a Great Use Case for Fermyon Cloud?
With WebAssembly’s small size and fast start-up time, we’re able to cold-start a module on every execution - this is core to the Spin application model. No memory or CPU is consumed by the WebAssembly module until there is a request to be handled. This is what usually makes cold starts a challenge: the first request to a server takes the hit of the code not yet being initialized and loaded into memory. Many systems work around this issue by keeping things running idle for longer periods to minimize the number of requests subjected to the cold start penalty. With Spin, all requests are handled from a cold start, but it takes less than a millisecond to start a WebAssembly module in Spin, so doing this on every request is not a problem.
In the case of a bot, which doesn’t have a traffic pattern of hundreds or thousands of requests per minute, we only use resources when a request comes in. This is truly a game-changer and makes it much easier to host these types of applications. You don’t have to be concerned with the cost of having a bot sit idle for most of the day, because an idle bot only consumes the disk space needed to store the module on the server that will eventually serve a request.
What’s Next?
We hope this gave you a bit of inspiration as to how you can benefit from the application and execution model of Spin, and host applications in the Fermyon Cloud.
Go take Spin for a spin - and try the Fermyon Cloud.
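If you build something along these lines, getting it into the Fermyon Cloud is only a few commands (sketched here as of Spin v0.6; spin login authenticates you against the cloud before you deploy):

# build the WebAssembly component, authenticate, and deploy to Fermyon Cloud
spin build
spin login
spin deploy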
To stay up-to-date with Fermyon, please join our Discord server, or subscribe to our newsletters.