Actuarial model in the cloud

Welcome to the first post in our new series, "Cloud computing". In this series, we'll explore how cloud computing can be used effectively for actuarial work. In this post, we'll focus on running a cash flow model and saving its results in the cloud.


Contents:

  1. Introduction
  2. Prerequisites
  3. Solution

Introduction

Cloud computing delivers IT resources, such as servers, computing power, networking, and databases, over the internet on an "on-demand" basis. Leading cloud infrastructure providers include Amazon Web Services (AWS), Microsoft Azure, and Google Cloud. For this guide, we'll use AWS and focus specifically on services available in the Free Tier.

Before we start, ensure you set up your Free Tier account on AWS. Remember to delete your AWS resources after following the tutorial to avoid unexpected costs.

Prerequisites

To run our cash flow model, we need to create two AWS resources (if you prefer to script this step rather than click through the console, see the sketch after this list):

  1. EC2 instance - based on an Amazon Linux AMI from the Free Tier. Save the key-pair (.pem) file; you will need it for the SSH connection.
  2. S3 bucket - this will store the results of our cash flow model.
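
For those who like to script their setup, here is a minimal boto3 sketch that creates both resources. It is an optional alternative to the console, not part of the main tutorial; the AMI ID, key-pair name, bucket name and region are placeholders to replace with your own values.

import boto3

REGION = "<your-region>"                      # the region of your Free Tier resources
AMI_ID = "<free-tier-amazon-linux-ami-id>"    # an Amazon Linux AMI from the Free Tier
KEY_NAME = "<your-key-pair-name>"
BUCKET_NAME = "<your-bucket-name>"

ec2 = boto3.client("ec2", region_name=REGION)
s3 = boto3.client("s3", region_name=REGION)

# Launch a t2.micro instance (Free Tier eligible)
ec2.run_instances(
    ImageId=AMI_ID,
    InstanceType="t2.micro",
    KeyName=KEY_NAME,
    MinCount=1,
    MaxCount=1,
)

# Create the bucket for the model results
# (outside us-east-1, S3 requires a LocationConstraint)
s3.create_bucket(
    Bucket=BUCKET_NAME,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)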

Additionally, set up an Identity and Access Management (IAM) role to allow the EC2 instance to access the S3 bucket.

Create the IAM Role:

  1. Navigate to the IAM service.
  2. Create a new role, selecting "AWS service" as the type of trusted entity and "EC2" as the use case.
  3. Attach the "AmazonS3FullAccess" policy.
  4. Give your role a meaningful name and description, then click "Create role". (A scripted alternative follows this list.)
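
The same role can also be created with boto3. Below is a sketch; the role name "cash-flow-model-role" and profile name "cash-flow-model-profile" are just examples.

import json
import boto3

iam = boto3.client("iam")

# Trust policy that lets EC2 assume the role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

iam.create_role(
    RoleName="cash-flow-model-role",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
    Description="Allows EC2 to access S3",
)
iam.attach_role_policy(
    RoleName="cash-flow-model-role",
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3FullAccess",
)

# EC2 attaches roles through an instance profile;
# the console creates one automatically, boto3 does not
iam.create_instance_profile(InstanceProfileName="cash-flow-model-profile")
iam.add_role_to_instance_profile(
    InstanceProfileName="cash-flow-model-profile",
    RoleName="cash-flow-model-role",
)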

Attach the IAM Role to the EC2 Instance:

  1. Navigate to EC2 in the AWS Management Console.
  2. Select your instance and go to Actions > Security > Modify IAM Role.
  3. Choose the IAM role you created and save the changes. (This step can also be scripted; see below.)
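
Scripted with boto3, the attachment is a single call. The region, profile name and instance ID below are placeholders:

import boto3

ec2 = boto3.client("ec2", region_name="<your-region>")

# Attach the instance profile that wraps the IAM role
ec2.associate_iam_instance_profile(
    IamInstanceProfile={"Name": "cash-flow-model-profile"},
    InstanceId="<your-instance-id>",
)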

Now your EC2 instance is ready to access the S3 bucket.
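
To double-check the access before moving on, you can SSH into the instance and list the bucket. Here is a quick check using paramiko (the same library we use in the Solution section); it assumes the AWS CLI is available on the instance, which is the case for Amazon Linux AMIs:

import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("<your-ec2-instance-public-ip>", username="ec2-user",
            key_filename="<your-pem-file-path>")

# No error (even with empty output) means the role works
stdin, stdout, stderr = ssh.exec_command("aws s3 ls s3://<your-bucket-name>/")
print(stdout.read().decode("utf-8"))
print(stderr.read().decode("utf-8"))
ssh.close()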

Solution

Now, let's dive into the solution. We'll first present the complete solution and then break it down step by step.

# utils.py
def exec_command(ssh, command):
    """Execute a command over an SSH connection and print its output."""
    try:
        stdin, stdout, stderr = ssh.exec_command(command)

        # Reading blocks until the remote command has finished
        stdout_content = stdout.read().decode("utf-8")
        if stdout_content:
            print(f"Stdout:\n{stdout_content}")

        stderr_content = stderr.read().decode("utf-8")
        if stderr_content:
            print(f"Stderr:\n{stderr_content}")
    except Exception as e:
        print(f"Exception:\n{str(e)}")

The exec_command function executes a given command over an SSH connection and prints the standard output and standard error, if any.
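
Note that exec_command does not check whether the command succeeded. If you want your script to react to failures, a possible variant (not part of the tutorial's utils.py) also returns the remote exit code:

def exec_command_checked(ssh, command):
    """Like exec_command, but also return the remote exit code."""
    stdin, stdout, stderr = ssh.exec_command(command)

    # Read the output first so a full buffer can't block the command
    stdout_content = stdout.read().decode("utf-8")
    stderr_content = stderr.read().decode("utf-8")
    exit_status = stdout.channel.recv_exit_status()

    if stdout_content:
        print(f"Stdout:\n{stdout_content}")
    if stderr_content:
        print(f"Stderr:\n{stderr_content}")
    return exit_status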


# main.py
import paramiko
from utils import exec_command

# Credentials
EC2_INSTANCE_IP = "<your-ec2-instance-public-ip>"
PRIVATE_KEY_PATH = "<your-pem-file-path>"
BUCKET_NAME = "<your-bucket-name>"

# Model information
HOST = "https://github.com/acturtle"  # replace with your host
REPOSITORY = "example"                # replace with your repository
MODEL = "mymodel"                     # replace with your model

# SSH connection
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(EC2_INSTANCE_IP, username="ec2-user", key_filename=PRIVATE_KEY_PATH)

# Execute commands
commands = [
    "sudo yum install -y python3",
    "sudo yum install -y python3-pip",
    "pip install cashflower",
    "sudo yum install -y git",
    f"git clone {HOST + '/' + REPOSITORY}",
    f"cd {REPOSITORY + '/' + MODEL}; python3 run.py",
    f"cd {REPOSITORY + '/' + MODEL + '/output'}; aws s3 cp '.' s3://{BUCKET_NAME}/ --recursive",
    f"cd {REPOSITORY + '/' + MODEL + '/output'}; rm -r *",
]

for command in commands:
    exec_command(ssh, command)

ssh.close()

Steps:

  1. Credentials:

    Fill in EC2_INSTANCE_IP, PRIVATE_KEY_PATH and BUCKET_NAME with your own values.

  2. Model information:

    Update HOST, REPOSITORY and MODEL variables based on your model and repository.

  3. SSH connection:

    Connect to the EC2 instance via SSH using the provided credentials.

  4. Execute commands:

    Run a series of commands to set up the environment, install the necessary tools, clone the repository, run the model, copy the results to the S3 bucket and clean up the output folder. (A sketch for downloading the results to your local machine follows below.)
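
Once the results are in S3, you can pull them to your local machine with boto3. Here is a minimal sketch; the bucket name and target directory are placeholders:

import os
import boto3

BUCKET_NAME = "<your-bucket-name>"
TARGET_DIR = "results"

s3 = boto3.resource("s3")
bucket = s3.Bucket(BUCKET_NAME)

os.makedirs(TARGET_DIR, exist_ok=True)
for obj in bucket.objects.all():
    # Keys are plain file names because we copied from inside the output folder
    bucket.download_file(obj.key, os.path.join(TARGET_DIR, obj.key))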

Congratulations! You have just run a cash flow model in the cloud! After completing the tutorial, remember to delete your AWS resources if you no longer need them; a scripted cleanup is sketched below.
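
The cleanup can be scripted as well. A sketch with placeholder identifiers; it terminates the instance and deletes the bucket together with its contents:

import boto3

ec2 = boto3.client("ec2", region_name="<your-region>")
s3 = boto3.resource("s3")

# Terminate the EC2 instance
ec2.terminate_instances(InstanceIds=["<your-instance-id>"])

# A bucket must be empty before it can be deleted
bucket = s3.Bucket("<your-bucket-name>")
bucket.objects.all().delete()
bucket.delete()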

Thanks for reading! If you have any questions or need assistance, feel free to ask in the comments. If you're interested in implementing cloud computing for your company, get in touch via the contact form.
