Automated DDL/DML script export

As part of our modeling and data pipeline processes, I need to publish DML/DDL SQL scripts automatically, say once per day.

My model export pipeline is based on PostgreSQL DML/DDL. I wrote a tool which parses these files and dynamically generates DML/DDL for all the databases we need to support (a minimal sketch of the idea follows the list):

  • Hive
  • Huawei Cloud Data Lake Insight - Carbon Data layer
  • Huawei Cloud Data Lake Insight - Hive layer - does not support all data types, bound to a specific fork
  • Huawei DWS - a scalable Postgres DB with a specific language dialect
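
To give a rough idea of what the converter does (a simplified illustration only; the dialect names and type mappings here are placeholders, not the actual tool):

    // Simplified sketch of the external PostgreSQL-to-dialect conversion idea.
    // The dialect names and type mappings are illustrative placeholders.
    import java.util.Map;

    public final class DialectMapper {
        // e.g. PostgreSQL "text" becomes "STRING" in Hive DDL
        private static final Map<String, Map<String, String>> TYPE_MAP = Map.of(
            "Hive", Map.of("text", "STRING", "integer", "INT", "timestamp", "TIMESTAMP"),
            "DWS",  Map.of("text", "TEXT", "integer", "INTEGER", "timestamp", "TIMESTAMP"));

        public static String mapType(String dialect, String postgresType) {
            // Fall back to the original type when no mapping is defined.
            return TYPE_MAP.getOrDefault(dialect, Map.of())
                           .getOrDefault(postgresType, postgresType);
        }
    }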

Yeah, it's quite complex :slight_smile:

But I need some API in VP which I can call and which will give me all SQL statements for all SELECTED diagrams in the selected SQL database language.

It would be really useful, as I could then automatically release all scripts from the modeling tool into Git for all users to consume.

Thank you for any hints.

The functionality could be based on the VP client, but maybe it's better to introduce some REST API compatible with Team Server which would provide such functionality; I'm not sure what is better in terms of business features from your perspective…

Ladislav

Hi Ladislav,

In the scripts folder there is a “GenerateORM” command-line script which can generate DDL for a specific DB. Does it help?

Thx for the link. Via the Java class reference in the script I found this page:

I will test what the output is. The idea was that I could provide a collection of diagrams or entities.

Also, in the client UI I have not only a DDL option but also a DML option, plus support for formatted output.

But let me try it; I will come back with what I think…

OK, so the export works, but it generates the whole project into a single DDL script. Now, during our evaluation period (we still have one month left) we were able to put about 400 tables into the model. We are designing multilayered big data storage (source, normalizing, denormalizing for fast query access).

To manage this efficiently I have different entities located in different ER diagrams; an ER diagram is an aggregation for a specific layer (a layer can even be split vertically into multiple smaller layers).

Now, the “horizontal” layer specifies the target technology (Hive, etc.), so the SQL language parser and converting tool I wrote can identify what must be converted into what.

It would really help if I could specify the input diagram name (could be an array) and/or generate DDL and DML files for a specific diagram. Optionally, this creates a situation where some entities are not (yet) bound to any diagram; these could go into something like “NON_DIAGRAM_BOUND” scripts.

So in that case I would be able to process it in some meaningful way.
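
To make it concrete, here is a rough sketch of how the grouping could look via the Open API (assuming “DBTable” is the model type of ER entities; this only collects the groups, the per-diagram DDL/DML generation itself is the missing piece):

    // Rough sketch: group ER entities by the diagram they appear on, so each
    // group could be fed to a per-diagram DDL/DML generator (not shown here).
    // Assumes the Open API model type "DBTable" for ER entities.
    import com.vp.plugin.ApplicationManager;
    import com.vp.plugin.diagram.IDiagramElement;
    import com.vp.plugin.diagram.IDiagramUIModel;
    import com.vp.plugin.model.IModelElement;
    import com.vp.plugin.model.IProject;
    import java.util.*;

    public class PerDiagramGrouping {
        public static Map<String, List<IModelElement>> groupEntitiesByDiagram() {
            IProject project =
                ApplicationManager.instance().getProjectManager().getProject();

            // Start with every entity; remove the ones we find on a diagram.
            Set<IModelElement> unbound = new LinkedHashSet<>(
                Arrays.asList(project.toModelElementArray("DBTable")));

            Map<String, List<IModelElement>> byDiagram = new LinkedHashMap<>();
            for (IDiagramUIModel diagram : project.toDiagramArray()) {
                for (IDiagramElement shape : diagram.toDiagramElementArray()) {
                    IModelElement element = shape.getModelElement();
                    if (element != null && "DBTable".equals(element.getModelType())) {
                        byDiagram.computeIfAbsent(diagram.getName(), k -> new ArrayList<>())
                                 .add(element);
                        unbound.remove(element);
                    }
                }
            }
            // Entities not placed on any diagram go to the NON_DIAGRAM_BOUND bucket.
            byDiagram.put("NON_DIAGRAM_BOUND", new ArrayList<>(unbound));
            return byDiagram;
        }
    }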

Additionally, it would really help if, for each diagram, I could get multiple files, as I can via the UI:

  • DDL Create
  • DDL Drop
  • DML Insert, Update, Delete, Select

So each diagram ends up in 6 files, plus an option for whether the output should be formatted or not :slight_smile: (a possible naming scheme is sketched below).
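
Just to be concrete about the layout, a purely hypothetical naming scheme for the six files could be:

    // Hypothetical naming scheme for the six scripts generated per diagram.
    import java.util.List;

    public final class ScriptNames {
        public static List<String> scriptFiles(String diagramName) {
            return List.of(
                diagramName + "_create.ddl",
                diagramName + "_drop.ddl",
                diagramName + "_select.dml",
                diagramName + "_insert.dml",
                diagramName + "_update.dml",
                diagramName + "_delete.dml");
        }
    }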

It would really help to extend the com.vp.cmd.GenerateORM class interface to be more flexible.

NOTE: I could achieve a similar goal by creating the mapping classes (Entity to Layer) outside of VP, but then I ask myself: why do I aggregate entities by diagrams in VP when I must do the same externally… and sure, this applies to DDL only.

Thx for the reply.

Ladislav

To summarize the above, I think the API could be extended along the following lines (a hypothetical interface shape is sketched after the list):

  • Diagram selection (set of entities)
  • Entity selection (very fine grained)
  • DDL scripts (DROP and/or CREATE)
  • DML script templates (SELECT, INSERT, UPDATE, DELETE)
  • Database technology type (Oracle, Postgres, Hive, etc.)
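
Expressed as code, the extension could look something like this (a hypothetical interface only; none of these types exist in VP today, they just encode the parameters above):

    // Hypothetical interface encoding the requested parameters; this is a
    // proposal sketch, not an existing VP API.
    import java.util.Map;
    import java.util.Set;

    public interface SqlScriptExporter {
        enum DdlKind { CREATE, DROP }
        enum DmlKind { SELECT, INSERT, UPDATE, DELETE }

        /** Returns file name -> generated script content. */
        Map<String, String> export(
                Set<String> diagramNames,  // diagram selection (set of entities)
                Set<String> entityNames,   // optional fine-grained entity selection
                Set<DdlKind> ddl,          // DROP and/or CREATE
                Set<DmlKind> dml,          // DML script templates
                String databaseType,       // e.g. "Oracle", "PostgreSQL", "Hive"
                boolean formatted);        // formatted output or not
    }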

Ladislav