Client and Server for Nethesis partner meeting badges 2017
The software can be run either with docker-compose in rootful mode or with podman-compose in rootless mode.

podman-compose runs the containers in rootless mode, so you need podman and podman-compose installed.
To run with podman-compose:

```
podman-compose up -d
```

To stop and remove containers:

```
podman-compose down
```

Docker runs the containers in rootful mode, so you need docker and docker-compose installed.
To run with docker-compose:

```
docker-compose up -d
```

To stop and remove containers:

```
docker-compose down
```

If the database does not start, you may need to add the following to the database service in `docker-compose.yml`:

```yaml
ulimits:
  nofile: 1048576
```

Note: this will not work with podman-compose.
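For context, the `ulimits` override goes under the database service in `docker-compose.yml`; a sketch assuming the service is named `db` (the actual service name in this project may differ):

```yaml
services:
  db:
    # Raise the open-file limit for the database container.
    ulimits:
      nofile: 1048576
```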
The compose file exposes the following services:
- Web UI: http://localhost:8888, websocket port: 35729
- Server: http://localhost:8080
- PHPMyAdmin: http://localhost:8081
- Printer driver: http://
This project includes several helper scripts in the `script/` folder. Below are the current defaults and important behaviors added recently.
- `script/csv_loader.py`
  - Purpose: convert a normalized attendees CSV into a `.sql` file of INSERTs for the `iscritti` table.
  - New default behavior: when no input filename is supplied, the script reads `./nethcheckin.csv` and writes `./nethcheckin.sql`.
  - Usage examples:
    - Default files: `python3 script/csv_loader.py` (reads `nethcheckin.csv`)
    - Custom input/output: `python3 script/csv_loader.py path/to/input.csv custom_output.sql`
  - Notes: the script writes `TRUNCATE TABLE iscritti;` followed by the INSERTs. Back up your database if needed.
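The conversion can be sketched roughly as follows. This is not the actual implementation in `script/csv_loader.py`; the column names (`nome`, `cognome`, `email`, `sala`) are illustrative assumptions, not the real `iscritti` schema:

```python
import csv
import io

def csv_to_sql(csv_text):
    """Turn a normalized attendees CSV into a TRUNCATE + INSERT script.

    Sketch only: column names are assumptions, and the quoting below is
    naive (it just doubles single quotes), not production-grade escaping.
    """
    rows = csv.DictReader(io.StringIO(csv_text))
    lines = ["TRUNCATE TABLE iscritti;"]
    for row in rows:
        values = ", ".join("'%s'" % row[c].replace("'", "''")
                           for c in ("nome", "cognome", "email", "sala"))
        lines.append(
            "INSERT INTO iscritti (nome, cognome, email, sala) VALUES (%s);" % values
        )
    return "\n".join(lines) + "\n"

print(csv_to_sql("nome,cognome,email,sala\nAda,Lovelace,ada@example.com,Sala Piazza\n"))
```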
- `script/eventbrite_hubspot_merge.py`
  - Purpose: merge an Eventbrite export and a HubSpot contacts export into the normalized CSV expected by `csv_loader.py`.
  - Defaults: `--eventbrite` defaults to `eventbrite.csv`, `--hubspot` defaults to `hubspot.csv`, and `--out` defaults to `nethcheckin.csv` when omitted.
  - Sala remapping rules applied automatically (examples from the most recent event):
    - `Sala B (ore 14.30 NethSecurity8, ore 15.30 NS8+NethService, ore 16.30 NethVoice)` -> `Sala Castello 1`
    - `Sala C (ore 14.30 NS8+NethService, ore 15.30 NethVoice, ore 16.30 NethSecurity8)` -> `Sala Castello 2`
    - `Sala D (ore 14.30 NethVoice, ore 15.30 NethSecurity8, ore 16.30 NS8+NethService)` -> `Sala Arco`
  - If the Eventbrite field `Parteciperò alla sessione pomeridiana | 10 ottobre` is set to the same string, the script maps the attendee to `Sala Piazza`.
  - The script attempts to set `tipo` to `Prospect` when the attendee email exists in the HubSpot export and the HubSpot `Tipo Lead AC` property equals `Prospect`. Otherwise `tipo` is `Partner`.
  - The script logs and prints the number of attendees that ended up without a `sala` (useful for manual cleanup).
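The remapping and `tipo` rules above can be sketched as a lookup table plus a HubSpot check. This is a sketch, not the script's actual internals; the row field names (`sala`, the email keys) are assumptions about the export formats:

```python
# Sala remapping and tipo assignment as described above (sketch).
SALA_MAP = {
    "Sala B (ore 14.30 NethSecurity8, ore 15.30 NS8+NethService, ore 16.30 NethVoice)": "Sala Castello 1",
    "Sala C (ore 14.30 NS8+NethService, ore 15.30 NethVoice, ore 16.30 NethSecurity8)": "Sala Castello 2",
    "Sala D (ore 14.30 NethVoice, ore 15.30 NethSecurity8, ore 16.30 NS8+NethService)": "Sala Arco",
}
PIAZZA_FIELD = "Parteciperò alla sessione pomeridiana | 10 ottobre"

def remap_sala(eventbrite_row):
    # Attendees whose afternoon-session field matches go to Sala Piazza.
    if eventbrite_row.get(PIAZZA_FIELD) == PIAZZA_FIELD:
        return "Sala Piazza"
    # Unknown salas map to an empty string, flagged for manual cleanup.
    return SALA_MAP.get(eventbrite_row.get("sala", ""), "")

def attendee_tipo(email, hubspot_by_email):
    # Prospect only when the email is in HubSpot with Tipo Lead AC == Prospect.
    contact = hubspot_by_email.get(email)
    if contact and contact.get("Tipo Lead AC") == "Prospect":
        return "Prospect"
    return "Partner"
```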
- `script/codereadr_push.py`
  - Purpose: convert the normalized CSV into the CodeREADr import format and upload it to a CodeREADr database.
  - Defaults and behavior: when no `--out` is provided, the script writes a temporary file. It clears the remote database by default (disable with `--no-clear`), and supports `--dry-run` and `--verbose`.
  - Provide credentials either via `--api-key`/`--database-id` or via the environment variables `CODEREADR_API_KEY` and `CODEREADR_DATABASE_ID`.
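Credential resolution follows the usual flag-or-environment pattern; a minimal sketch under the assumption that CLI flags take precedence (the function name is illustrative, not the script's actual code):

```python
import os

def resolve_credentials(api_key_arg=None, database_id_arg=None, env=None):
    """Prefer explicit CLI arguments, fall back to the environment.

    Sketch of the behavior described above; exits if neither source
    provides a value.
    """
    env = env if env is not None else os.environ
    api_key = api_key_arg or env.get("CODEREADR_API_KEY")
    database_id = database_id_arg or env.get("CODEREADR_DATABASE_ID")
    if not api_key or not database_id:
        raise SystemExit("missing CodeREADr credentials")
    return api_key, database_id
```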
A convenience bash script is provided to run the common sequence using the default filenames (placed in the project root):

- Eventbrite export: `eventbrite.csv` (exported manually from Eventbrite)
- HubSpot export: `hubspot.csv` (exported manually from HubSpot)
- Merged normalized CSV produced: `nethcheckin.csv`
- SQL produced: `nethcheckin.sql`
First, set up the CodeREADr credentials as environment variables:

```
export CODEREADR_API_KEY=your_api_key_here
export CODEREADR_DATABASE_ID=your_database_id_here
```

The script is `script/run_pipeline.sh` and performs these steps:
- Merge Eventbrite + HubSpot into `nethcheckin.csv` using `eventbrite_hubspot_merge.py`.
- Generate `nethcheckin.sql` from `nethcheckin.csv` using `csv_loader.py`.
- Automatically upload the IDs to CodeREADr using `codereadr_push.py`.
- Load the generated SQL into the running database container if available; otherwise print the command to run manually.
Run the pipeline from the project root:

```
script/run_pipeline.sh
```

Manual steps and safety notes:
- Verify the source CSVs are present at the default paths before running: `eventbrite.csv` and `hubspot.csv`.
- The pipeline will generate `${OUT_CSV%.*}.sql` (usually `nethcheckin.sql`). The SQL includes a `TRUNCATE TABLE iscritti;` line, so back up your database before importing if needed.
- After SQL generation the script will attempt to import the `.sql` file into a running container named `neth-check-in_db_1` automatically. Container runtimes are tried in priority order: `podman` (preferred), then `docker`.
- The CodeREADr upload will clear the remote database before importing new data.
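The runtime selection can be sketched as a simple probe for the first available binary, podman before docker (a sketch of the priority order described above, not the shell script's actual code):

```python
import shutil

def pick_runtime(which=shutil.which):
    """Return the first available container runtime, podman before docker."""
    for runtime in ("podman", "docker"):
        if which(runtime):
            return runtime
    return None  # no runtime found: caller prints the manual import command

# The pipeline would then run something like:
#   <runtime> exec -i neth-check-in_db_1 ... < nethcheckin.sql
```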