Faasera Deployment Guide

This guide explains how to deploy Faasera components across various environments, including serverless cloud platforms, SDK-based pipelines, and the full visual UI stack. It provides deployment patterns, prerequisites, and operational best practices.


Deployment Modes Overview

| Mode | Description | Ideal For |
|------|-------------|-----------|
| Cloud Functions | Lightweight masking, profiling, and validation via serverless functions | Event-driven processing, microservices |
| SDK Integration | Java / PySpark SDK for Spark and Databricks environments | Batch/streaming data pipelines |
| REST API | Host the Faasera engine with REST endpoints | CI/CD automation, centralized control |
| Full UI Platform | Visual orchestration and management platform | End-to-end compliance operations |
| Plugin Mode | Use with ETL tools like NiFi, ADF, Airflow | Drag-and-drop pipeline integration |

Cloud Function Deployment

Supported Platforms:

Requirements:

Example: AWS Lambda


# Deploy via AWS CLI
aws lambda create-function \
  --function-name FaaseraFunction \
  --runtime java11 \
  --handler ai.faasera.lambda.FaaseraHandler \
  --zip-file fileb://build/libs/faasera.<build>.jar \
  --role arn:aws:iam::<account>:role/FaaseraLambdaRole
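The handler referenced above (`ai.faasera.lambda.FaaseraHandler`) is a Java class. For illustration only, here is a Python-style sketch of what an equivalent event handler could do; the `mask_record` function, the event shape, and the field names are assumptions, not part of the Faasera API:

```python
import json

# Hypothetical masking routine -- stands in for the call into the Faasera engine.
def mask_record(record, fields_to_mask):
    """Replace sensitive field values with a fixed-length mask."""
    masked = dict(record)
    for field in fields_to_mask:
        if field in masked:
            masked[field] = "*" * 8
    return masked

def handler(event, context=None):
    """Lambda-style entry point: mask the configured fields in each record."""
    records = event.get("records", [])
    fields = event.get("maskFields", [])
    return {"records": [mask_record(r, fields) for r in records]}

# Example invocation with a synthetic event
result = handler({"records": [{"name": "Ada", "ssn": "123-45-6789"}],
                  "maskFields": ["ssn"]})
print(json.dumps(result))
```

In a real deployment the function would be triggered by an event source (e.g., an S3 upload or a queue message) rather than called directly.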

SDK Deployment (Java / PySpark)

Use Cases:

Java SDK

PySpark SDK
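As a rough sketch of the kind of column-level transformation an SDK integration performs, here is deterministic tokenization in plain Python; in PySpark this logic would typically be wrapped in a UDF and applied to a DataFrame column. The function name and salt are illustrative, not Faasera APIs:

```python
import hashlib

def tokenize(value, salt="demo-salt"):
    """Deterministically replace a value with a stable token so joins still work."""
    digest = hashlib.sha256((salt + str(value)).encode("utf-8")).hexdigest()
    return "tok_" + digest[:12]

# Applying the transform over a column of values (a DataFrame column in Spark)
emails = ["a@example.com", "b@example.com", "a@example.com"]
tokens = [tokenize(e) for e in emails]
# Identical inputs yield identical tokens, preserving referential integrity
```

Deterministic tokenization is one common masking strategy; because equal inputs map to equal tokens, masked datasets can still be joined on the tokenized column.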


REST API Deployment

Quickstart:

java -jar faasera-api-server.jar
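Once the server is running, clients can call it over HTTP. A minimal Python sketch of constructing such a request follows; the `/api/v1/mask` path, the port, and the payload fields are assumptions for illustration, not documented Faasera endpoints:

```python
import json
import urllib.request

def build_mask_request(records, policy, base_url="http://localhost:8080"):
    """Build (but do not send) a JSON POST request to a hypothetical masking endpoint."""
    body = json.dumps({"policy": policy, "records": records}).encode("utf-8")
    return urllib.request.Request(
        base_url + "/api/v1/mask",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_mask_request([{"ssn": "123-45-6789"}], policy="pii-default")
# Send with urllib.request.urlopen(req) once the server is up
```

This pattern (version-prefixed JSON endpoints) is what makes the REST mode convenient for CI/CD automation: pipelines can post records for masking or validation without embedding the engine.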

Full Platform UI Deployment

Architecture:

Quickstart (Docker Compose):

docker-compose up -d

Plugin Deployment

Available Plugins:

Usage:


Licensing & Security


Best Practices

| Area | Recommendation |
|------|----------------|
| Scalability | Use serverless for burst loads, SDKs for large volumes |
| Monitoring | Enable Prometheus/Grafana or use logs with ELK/Splunk |
| Compliance | Store policies in a version-controlled repo (e.g., Git) |
| Auditing | Retain validation logs and masking reports for 12+ months |
| Security | Rotate encryption keys and API tokens regularly |
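The "Compliance" recommendation above amounts to policy-as-code. A minimal sketch of loading and sanity-checking a versioned policy file follows; the schema (a `rules` list whose entries carry `field` and `strategy` keys) is an assumption for illustration, not Faasera's actual policy format:

```python
import json

REQUIRED_KEYS = {"field", "strategy"}

def load_policy(text):
    """Parse a policy document and verify each rule has the required keys."""
    policy = json.loads(text)
    for i, rule in enumerate(policy.get("rules", [])):
        missing = REQUIRED_KEYS - rule.keys()
        if missing:
            raise ValueError(f"rule {i} missing keys: {sorted(missing)}")
    return policy

# In practice the text would be read from a file checked into Git
example = '{"rules": [{"field": "ssn", "strategy": "redact"}]}'
policy = load_policy(example)
```

Validating policies at load time (or in a CI check on the policy repo) catches malformed rules before they reach a masking run.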