name: Export JSON/XML/YAML/CSV/MYSQL/PSQL/SQLITE/SQLSERVER/MONGODB
on:
push:
branches:
- master
paths-ignore:
- "**"
- "!bin/Commands/Export**"
workflow_dispatch:
inputs:
pass:
description: "Passcode"
required: true
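# Note: the paths filter above relies on negation - "**" ignores every path,
# then "!bin/Commands/Export**" re-includes the export command sources, so a
# push to master triggers this workflow only when those files change (or when
# it is dispatched manually).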
jobs:
export:
name: JSON/XML/YAML/CSV/MYSQL/PSQL/SQLITE/SQLSERVER/MONGODB
runs-on: ubuntu-24.04
strategy:
matrix:
php-version: [8.2]
node-version: [20.x]
fail-fast: false
steps:
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 1
submodules: true
ref: ${{ github.head_ref }}
persist-credentials: true
lfs: false
- name: Setup PHP
uses: shivammathur/setup-php@v2
with:
php-version: ${{ matrix.php-version }}
extensions: intl #optional
coverage: none
ini-values: "post_max_size=256M, memory_limit=512M" #optional
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v4
with:
node-version: ${{ matrix.node-version }}
cache: 'npm'
cache-dependency-path: nmig/package-lock.json
- name: Cache Composer dependencies
uses: actions/cache@v4
with:
path: bin/vendor
key: ${{ runner.os }}-composer-${{ hashFiles('bin/composer.lock') }}
restore-keys: ${{ runner.os }}-composer-
- name: Start MySQL service
run: |
sudo systemctl start mysql.service
mysql -V
# Wait for MySQL to be ready
while ! mysqladmin ping -h"127.0.0.1" --silent; do
echo "Waiting for MySQL..."
sleep 1
done
- name: Start PostgreSQL service
run: |
sudo systemctl start postgresql.service
pg_isready
pg_lsclusters
sudo -u postgres psql -c "CREATE DATABASE world;"
sudo -u postgres psql -c "ALTER USER postgres PASSWORD 'postgres';"
sudo -u postgres psql -c "\l"
- name: Setup MongoDB
uses: supercharge/mongodb-github-action@1.10.0
with:
mongodb-version: '6.0'
mongodb-replica-set: rs0
- name: Install MongoDB Database Tools
run: |
# Download MongoDB Database Tools directly
wget -q https://fastdl.mongodb.org/tools/db/mongodb-database-tools-ubuntu2204-x86_64-100.7.3.deb
sudo dpkg -i mongodb-database-tools-ubuntu2204-x86_64-100.7.3.deb
rm -rf mongodb-database-tools-ubuntu2204-x86_64-100.7.3.deb
# Verify installation
mongoimport --version
      - name: Setup MySQL DB
        run: |
          echo "🗄️ Creating MySQL database..."
          mysql -uroot -proot -e "CREATE DATABASE world CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;"
          mysql -uroot -proot -e "SHOW DATABASES;"
          echo "✅ Database created"
      - name: Initialize Database Schema
        run: |
          echo "📋 Creating database tables..."
          mysql -uroot -proot world < bin/db/schema.sql
          echo "✅ Schema initialized - verifying tables..."
          mysql -uroot -proot -e "USE world; SHOW TABLES;"
          mysql -uroot -proot -e "USE world; SHOW CREATE TABLE regions\G" | head -20
          echo "✅ All tables created successfully"
      - name: Setup Python Dependencies
        run: |
          python -m pip install --upgrade pip
          pip install mysql-connector-python
          echo "✅ Python dependencies installed"
      - name: Import JSON to MySQL
        run: |
          echo "📥 Importing contributions to MySQL..."
          python3 bin/scripts/sync/import_json_to_mysql.py --host localhost --user root --password root --database world
          echo "✅ Import complete - IDs auto-assigned by MySQL"
      - name: Export MySQL to JSON (Validation)
        run: |
          echo "🔄 Syncing MySQL back to contributions JSON for validation..."
          python3 bin/scripts/sync/sync_mysql_to_json.py --host localhost --user root --password root --database world
          echo "✅ MySQL → JSON sync complete"
          echo "🔍 Validating round-trip consistency..."
          # Show what changed (if anything)
          git diff --stat contributions/ || echo "No changes detected - perfect round-trip!"
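      # If the JSON → MySQL → JSON round-trip is lossless, the `git diff` above
      # reports no changes; a non-empty diff indicates an importer/exporter
      # mismatch worth inspecting before the export steps run.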
- name: Setup & Run NMIG (MySQL to PostgreSQL)
run: |
cp nmig.config.json nmig/config/config.json
cd nmig
npm install
npm run build
npm start
cd ..
- name: Setup MySQLtoSQLite
run: |
python -m pip install --upgrade pip
pip install mysql-to-sqlite3
mysql2sqlite --version
- name: Setup variables
run: |
echo "region_count=$(mysql -uroot -proot -e 'SELECT COUNT(*) FROM world.regions;' -s)" >> $GITHUB_ENV
echo "subregion_count=$(mysql -uroot -proot -e 'SELECT COUNT(*) FROM world.subregions;' -s)" >> $GITHUB_ENV
echo "country_count=$(mysql -uroot -proot -e 'SELECT COUNT(*) FROM world.countries;' -s)" >> $GITHUB_ENV
echo "state_count=$(mysql -uroot -proot -e 'SELECT COUNT(*) FROM world.states;' -s)" >> $GITHUB_ENV
echo "city_count=$(mysql -uroot -proot -e 'SELECT COUNT(*) FROM world.cities;' -s)" >> $GITHUB_ENV
          # Build a correct ordinal suffix (1st/2nd/3rd/...) rather than "%dth",
          # which would print "1th", "22th", etc.
          day=$(date +%-d)
          case $day in 1|21|31) suffix=st ;; 2|22) suffix=nd ;; 3|23) suffix=rd ;; *) suffix=th ;; esac
          echo "current_date=${day}${suffix} $(date +'%b %Y')" >> $GITHUB_ENV
- name: Composer Dependencies
working-directory: ./bin
run: |
composer install
php console list
- name: Export JSON
working-directory: ./bin
run: php console export:json
- name: Convert JSON to GeoJSON
working-directory: ./bin
run: |
python3 scripts/export/json_to_geojson.py
- name: Convert JSON to Toon Format
working-directory: ./bin
run: |
python3 scripts/export/json_to_toon.py
- name: Export XML
working-directory: ./bin
run: php console export:xml
- name: Export YAML
working-directory: ./bin
run: php console export:yaml
- name: Export CSV
working-directory: ./bin
run: php console export:csv
- name: Export MySQL SQL
run: |
mkdir -p sql
mysqldump -uroot -proot --single-transaction --add-drop-table --disable-keys --set-charset --skip-add-locks world regions > sql/regions.sql
mysqldump -uroot -proot --single-transaction --add-drop-table --disable-keys --set-charset --skip-add-locks world subregions > sql/subregions.sql
mysqldump -uroot -proot --single-transaction --add-drop-table --disable-keys --set-charset --skip-add-locks world countries > sql/countries.sql
mysqldump -uroot -proot --single-transaction --add-drop-table --disable-keys --set-charset --skip-add-locks world states > sql/states.sql
mysqldump -uroot -proot --single-transaction --add-drop-table --disable-keys --set-charset --skip-add-locks world cities > sql/cities.sql
# Export complete world database
mysqldump -uroot -proot --single-transaction --add-drop-table --disable-keys --set-charset --skip-add-locks world > sql/world.sql
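      # mysqldump flags used above: --single-transaction takes a consistent
      # InnoDB snapshot without locking tables, --add-drop-table makes the dumps
      # re-runnable, and --disable-keys/--skip-add-locks speed up re-import.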
      - name: Add DROP TABLE commands to world.sql
        run: |
          echo "🧹 Adding DROP TABLE commands in correct order..."
          # Remove any existing DROP TABLE commands
          grep -v "DROP TABLE" sql/world.sql > sql/world_temp.sql
          # Create new file with DROP TABLE commands in correct order (reverse of foreign key dependencies)
          cat > sql/world.sql << 'EOF'
          -- -------------------------------------------------------------
          -- Generated from MySQL database
          -- Combined world database
          -- -------------------------------------------------------------
          DROP TABLE IF EXISTS `cities`;
          DROP TABLE IF EXISTS `states`;
          DROP TABLE IF EXISTS `countries`;
          DROP TABLE IF EXISTS `subregions`;
          DROP TABLE IF EXISTS `regions`;
          EOF
          # Append the dump without DROP TABLE commands
          cat sql/world_temp.sql >> sql/world.sql
          rm sql/world_temp.sql
          echo "✅ DROP TABLE commands added"
      - name: Generate Schema Files
        run: |
          echo "📋 Generating schema files..."
          # Export MySQL schema only (no data)
          mysqldump -uroot -proot --no-data --single-transaction --add-drop-table world > sql/schema.sql
          # Also export for other formats
          mysqldump -uroot -proot --no-data --single-transaction --add-drop-table world > psql/schema.sql
          mysqldump -uroot -proot --no-data --single-transaction --add-drop-table world > sqlserver/schema.sql
          echo "✅ Schema files generated"
- name: Export PostgreSQL SQL
env:
PGPASSWORD: postgres
run: |
mkdir -p psql
pg_dump --dbname=postgresql://postgres:postgres@localhost/world -Fp --inserts --clean --if-exists -t regions > psql/regions.sql
pg_dump --dbname=postgresql://postgres:postgres@localhost/world -Fp --inserts --clean --if-exists -t subregions > psql/subregions.sql
pg_dump --dbname=postgresql://postgres:postgres@localhost/world -Fp --inserts --clean --if-exists -t countries > psql/countries.sql
pg_dump --dbname=postgresql://postgres:postgres@localhost/world -Fp --inserts --clean --if-exists -t states > psql/states.sql
pg_dump --dbname=postgresql://postgres:postgres@localhost/world -Fp --inserts --clean --if-exists -t cities > psql/cities.sql
pg_dump --dbname=postgresql://postgres:postgres@localhost/world -Fp --inserts --clean --if-exists > psql/world.sql
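      # pg_dump notes: --inserts emits portable INSERT statements instead of
      # COPY, and --clean --if-exists prepends DROP ... IF EXISTS so each dump
      # can be replayed against an existing database.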
- name: Export SQLite
run: |
# Clean up any existing SQLite files first
rm -rf sqlite/
mkdir -p sqlite
mysql2sqlite -d world -t regions --mysql-password root -u root -f sqlite/regions.sqlite3
mysql2sqlite -d world -t subregions --mysql-password root -u root -f sqlite/subregions.sqlite3
mysql2sqlite -d world -t countries --mysql-password root -u root -f sqlite/countries.sqlite3
mysql2sqlite -d world -t states --mysql-password root -u root -f sqlite/states.sqlite3
mysql2sqlite -d world -t cities --mysql-password root -u root -f sqlite/cities.sqlite3
mysql2sqlite -d world --mysql-password root -u root -f sqlite/world.sqlite3
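      # mysql2sqlite (from the mysql-to-sqlite3 package) copies tables from the
      # running MySQL server into a standalone SQLite file; -t limits the copy
      # to a single table, and omitting it transfers the whole database.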
- name: Export SQL Server
working-directory: ./bin
run: php console export:sql-server
- name: Export MongoDB
working-directory: ./bin
run: |
php console export:mongodb
ls -la ../mongodb
- name: Import MongoDB
working-directory: ./mongodb
run: |
# Wait for MongoDB to be ready
sleep 5
echo "Importing collections..."
mongoimport --host localhost:27017 --db world --collection regions --file regions.json --jsonArray
mongoimport --host localhost:27017 --db world --collection subregions --file subregions.json --jsonArray
mongoimport --host localhost:27017 --db world --collection countries --file countries.json --jsonArray
mongoimport --host localhost:27017 --db world --collection states --file states.json --jsonArray
mongoimport --host localhost:27017 --db world --collection cities --file cities.json --jsonArray
echo "Import completed"
# Create a MongoDB dump
mongodump --host localhost:27017 --db world --out mongodb-dump
# Compress the dump
tar -czvf world-mongodb-dump.tar.gz mongodb-dump
echo "MongoDB dump created at mongodb/world-mongodb-dump.tar.gz"
rm -rf mongodb-dump regions.json subregions.json countries.json states.json cities.json
      - name: Compress All Large Files
        run: |
          echo "🗜️ Compressing all large files..."
          echo "================================"
          # JSON Files
          echo "📄 Compressing JSON files..."
          if [ -f json/countries+states+cities.json ]; then
            gzip -9 -k -f json/countries+states+cities.json
            echo " ✓ json/countries+states+cities.json.gz"
          fi
          if [ -f json/cities.json ]; then
            gzip -9 -k -f json/cities.json
            echo " ✓ json/cities.json.gz"
          fi
          # TOON Files
          echo "📄 Compressing TOON files..."
          if [ -f toon/cities.toon ]; then
            gzip -9 -k -f toon/cities.toon
            echo " ✓ toon/cities.toon.gz"
          fi
          # GEOJSON Files
          echo "📄 Compressing GEOJSON files..."
          if [ -f geojson/cities.geojson ]; then
            gzip -9 -k -f geojson/cities.geojson
            echo " ✓ geojson/cities.geojson.gz"
          fi
          # XML Files
          echo "📄 Compressing XML files..."
          if [ -f xml/cities.xml ]; then
            gzip -9 -k -f xml/cities.xml
            echo " ✓ xml/cities.xml.gz"
          fi
          # YAML Files
          echo "📄 Compressing YAML files..."
          if [ -f yml/cities.yml ]; then
            gzip -9 -k -f yml/cities.yml
            echo " ✓ yml/cities.yml.gz"
          fi
          # CSV Files
          echo "📄 Compressing CSV files..."
          if [ -f csv/translations.csv ]; then
            gzip -9 -k -f csv/translations.csv
            echo " ✓ csv/translations.csv.gz"
          fi
          # MySQL SQL Files
          echo "📄 Compressing MySQL SQL files..."
          gzip -9 -k -f sql/world.sql
          gzip -9 -k -f sql/cities.sql
          echo " ✓ sql/world.sql.gz"
          echo " ✓ sql/cities.sql.gz"
          # PostgreSQL SQL Files
          echo "📄 Compressing PostgreSQL SQL files..."
          gzip -9 -k -f psql/world.sql
          gzip -9 -k -f psql/cities.sql
          echo " ✓ psql/world.sql.gz"
          echo " ✓ psql/cities.sql.gz"
          # SQLite Files
          echo "📄 Compressing SQLite files..."
          gzip -9 -k -f sqlite/world.sqlite3
          gzip -9 -k -f sqlite/cities.sqlite3
          echo " ✓ sqlite/world.sqlite3.gz"
          echo " ✓ sqlite/cities.sqlite3.gz"
          # SQL Server Files
          echo "📄 Compressing SQL Server files..."
          if [ -f sqlserver/world.sql ]; then
            gzip -9 -k -f sqlserver/world.sql
            echo " ✓ sqlserver/world.sql.gz"
          fi
          if [ -f sqlserver/cities.sql ]; then
            gzip -9 -k -f sqlserver/cities.sql
            echo " ✓ sqlserver/cities.sql.gz"
          fi
          echo "================================"
          echo "✅ All compression complete!"
          echo ""
          echo "📊 Compressed file sizes:"
          ls -lh json/*.gz toon/*.gz geojson/*.gz xml/*.gz yml/*.gz csv/*.gz sql/*.gz psql/*.gz sqlite/*.gz sqlserver/*.gz 2>/dev/null || true
- name: Update README.md
run: |
sed -i "s/Total Regions : [0-9]* <br>/Total Regions : $region_count <br>/" README.md
sed -i "s/Total Sub Regions : [0-9]* <br>/Total Sub Regions : $subregion_count <br>/" README.md
sed -i "s/Total Countries : [0-9]* <br>/Total Countries : $country_count <br>/" README.md
sed -i "s/Total States\/Regions\/Municipalities : [0-9]* <br>/Total States\/Regions\/Municipalities : $state_count <br>/" README.md
sed -i "s/Total Cities\/Towns\/Districts : [0-9]* <br>/Total Cities\/Towns\/Districts : $city_count <br>/" README.md
sed -i "s/Last Updated On : .*$/Last Updated On : $current_date/" README.md
- name: Create Pull Request
uses: peter-evans/create-pull-request@v7
with:
          commit-message: |
            📦 Export database formats - ${{ env.current_date }}
            ✅ All export formats completed successfully:
            - JSON/TOON/GEOJSON/XML/YAML/CSV: Structured data exports
            - MySQL/PostgreSQL: SQL dump exports
            - SQLite: Database file exports
            - SQL Server/MongoDB: Alternative format exports
            📊 Total records: ${{ env.country_count }} countries, ${{ env.state_count }} states, ${{ env.city_count }} cities
          committer: Darshan Gada <gadadarshan@gmail.com>
          signoff: true
          branch: export/Files
          delete-branch: true
          title: "🌍 Database Export - ${{ env.current_date }}"
          body: |
            ## 📦 Database Export Success
            All export formats have been successfully generated.
            ### 📊 Data Statistics
            - **Regions**: ${{ env.region_count }}
            - **Subregions**: ${{ env.subregion_count }}
            - **Countries**: ${{ env.country_count }}
            - **States**: ${{ env.state_count }}
            - **Cities**: ${{ env.city_count }}
            ### ✅ Generated Formats
- **JSON** - Structured data format
- **TOON** - Structured data format
- **GEOJSON** - Structured data format
- **XML** - Markup language format
- **YAML** - Human-readable format
- **CSV** - Spreadsheet format
- **MySQL** - Database dumps
- **PostgreSQL** - Database dumps
- **SQLite** - Portable databases
- **SQL Server** - T-SQL scripts
- **MongoDB** - Collections + dump
---
*Generated automatically by GitHub Actions*
labels: |
exports
automated
reviewers: dr5hn