Development

Installation

How to set up and run Envision Portal locally for development.

Prerequisites

  • Node.js 22 (managed via mise)
  • pnpm
  • PostgreSQL database
  • Azure storage account with two containers: one for drafts, one for published datasets
  • SMTP email server

Optional for full-stack platform development and operations:

  • Docker for containerized development and deployment parity
  • Azure resources for compute and storage-backed deployment workflows
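Since Node.js is pinned via mise, the version can be recorded in a mise config file at the repository root so every developer gets the same toolchain. A minimal sketch (the filename and section follow mise's conventions; check whether the repository already ships one before adding it):

mise.toml:

[tools]
node = "22"

Running mise install inside the project directory then installs the pinned version automatically.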

Setup

Clone the repository and install dependencies:

git clone <your-repo-url>
cd envision-portal
pnpm install

Copy the example environment file and fill in the required values:

cp .env.example .env

Run database migrations:

pnpm prisma:migrate:deploy

Start the development server:

pnpm dev

The app runs at http://localhost:3000.

Environment Variables

  • DATABASE_URL: PostgreSQL connection string
  • MAIL_HOST: SMTP host
  • MAIL_PORT: SMTP port
  • MAIL_USER: SMTP username
  • MAIL_PASS: SMTP password
  • MAIL_FROM: From address for outgoing emails
  • EMAIL_VERIFICATION_DOMAIN: Base URL used in verification email links
  • AZURE_DRAFT_*: Azure Data Lake credentials for draft storage
  • AZURE_PUBLISHED_*: Azure Data Lake credentials for published storage
  • NUXT_SITE_URL: Public URL of the application
  • NUXT_SITE_ENV: One of development, staging, or production
  • EXTERNAL_API_KEY: Optional key for external API integrations
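A starter .env for local development might look like the following. Every value below is a placeholder for illustration; .env.example in the repository is the authoritative reference:

DATABASE_URL=postgresql://envision:envision@localhost:5432/envision_portal
MAIL_HOST=localhost
MAIL_PORT=1025
MAIL_USER=dev
MAIL_PASS=changeme
MAIL_FROM=no-reply@example.com
EMAIL_VERIFICATION_DOMAIN=http://localhost:3000
NUXT_SITE_URL=http://localhost:3000
NUXT_SITE_ENV=development
# AZURE_DRAFT_* and AZURE_PUBLISHED_* values come from your
# Azure storage account and its two containers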

Other Commands

  • pnpm build: Build for production
  • pnpm preview: Preview the production build locally
  • pnpm prisma:migrate:deploy: Apply pending database migrations
  • pnpm prisma:studio: Open Prisma Studio to browse the database

Docker

A Dockerfile is included in the project root. Build and run the container:

docker build -t envision-portal .
docker run -p 3000:3000 --env-file .env envision-portal
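For closer parity with a containerized setup, a Compose file can run the app alongside PostgreSQL. This is a sketch only; the service names, image tag, and database credentials below are illustrative, not taken from the repository:

docker-compose.yml:

services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: envision
      POSTGRES_PASSWORD: envision
      POSTGRES_DB: envision_portal
    ports:
      - "5432:5432"
  app:
    build: .
    env_file: .env
    ports:
      - "3000:3000"
    depends_on:
      - db

With this layout, DATABASE_URL in .env should point at the db service (host db, port 5432) rather than localhost.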

Deployment via Kamal is configured in the .kamal/ directory and config/deploy.yml.
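config/deploy.yml follows Kamal's standard layout. A trimmed sketch of the shape to expect, with the service name, image, host, and registry values as placeholders (the real values live in the repository's config):

service: envision-portal
image: your-registry/envision-portal
servers:
  web:
    - 203.0.113.10
registry:
  server: ghcr.io
  username: your-user
  password:
    - KAMAL_REGISTRY_PASSWORD
env:
  secret:
    - DATABASE_URL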

Technical Stack Notes

The Envision Portal architecture is built around Nuxt and Nitro for web and API delivery, with Azure-backed storage for dataset files and containerized deployment workflows for maintainability.

Planned ecosystem components include:

  • Search indexing services to improve dataset discovery
  • Dedicated Python services for AI, validation, and data-processing workloads
  • Extended automation for external dataset indexing and metadata enrichment