As far as I understand, Moqui Framework handles auto-incrementing primary keys internally through the entity facade, and it does not provide an option for auto-increment in the generated database tables themselves.
I would like auto-increment to apply even when data is inserted directly into the database via SQL statements, in addition to Moqui managing it automatically when needed.
One possible approach I'm considering is modifying the database tables to include an auto-increment constraint. Could you please advise on how to configure or implement this so that auto-increment behaviour stays consistent?
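For reference, a minimal sketch of how I understand PKs are currently generated through the entity facade (Groovy, with `ec` as the ExecutionContext; the entity name org.example.Invoice and its fields are hypothetical placeholders):

```groovy
// Sketch of how sequenced PKs are normally assigned via the entity facade,
// inside a Groovy service script; entity/field names are placeholders.
def invoice = ec.entity.makeValue("org.example.Invoice")
invoice.setSequencedIdPrimary()          // next value from the SequenceValueItem bank
invoice.set("invoiceDate", ec.user.nowTimestamp)
invoice.create()

// The facade can also hand out the next sequence value directly:
String nextId = ec.entity.sequencedIdPrimary("org.example.Invoice", null, null)
```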
I don't want to add auto-increment to the database tables manually; I'm fine if Moqui does it.
Moqui has automatic APIs to handle updates to the database, and you can also add new services if some specific handling is needed. Do you have a use case where this is not feasible?
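For example, the automatic entity CRUD services can be called without writing any service definitions; a minimal sketch (entity and field names are hypothetical):

```groovy
// Sketch of calling Moqui's implicit entity CRUD services from Groovy,
// with "ec" as the ExecutionContext; entity/field names are placeholders.

// create#EntityName inserts a record; when no PK is passed, a sequenced ID
// is generated and returned in the result map
Map created = ec.service.sync().name("create#org.example.Invoice")
        .parameters([invoiceDate: ec.user.nowTimestamp]).call()

// update#EntityName updates, store#EntityName creates or updates
ec.service.sync().name("update#org.example.Invoice")
        .parameters([invoiceId: created.invoiceId, statusId: "InvoiceSent"]).call()
```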
My use case involves using Moqui solely for creating entities and reading data from them. For data insertion, I plan to use NiFi to directly insert data into these entities by reading from CSV files.
My suggestion would be to use the Moqui API to insert the data. This means you would create a UserAccount in Moqui with an API key, give it the needed authorizations, and call the REST API from NiFi if that is possible.
This lets you make use of Moqui features like SECAs and EECAs and leaves PK handling to the framework, so if you later need to change those entities within Moqui it would be straightforward.
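If NiFi can make HTTP calls, here is a rough sketch of what a single insert could look like from any JVM client (for example a NiFi ExecuteScript processor). The host, entity name, field names, and login key are placeholders, and I'm assuming the standard /rest/e1/ entity endpoint with api_key authentication:

```groovy
// Hedged sketch: POST one record to Moqui's entity REST API.
// Host, entity, field names, and the api_key value are placeholders.
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse
import groovy.json.JsonOutput

String body = JsonOutput.toJson([invoiceDate: "2024-01-15 00:00:00"])
HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("https://moqui.example.com/rest/e1/org.example.Invoice?api_key=LOGIN_KEY_HERE"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()
HttpResponse<String> response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
// on success Moqui returns the created record, including the generated PK
println response.body()
```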
If this is not possible, you could insert the data directly, keeping in mind that any EECAs in place would not be triggered in Moqui, and that if you then need to insert data from within Moqui you would have to specify PKs explicitly instead of relying on Moqui's auto-increment.
If you want Moqui to understand auto-increment within the DB, I think that would be possible, but it would require modifications to the framework.
Yes, we can make API calls using NiFi, but a simpler solution we considered is to create a CSV export of the entire dataset and insert it directly into the database using Moqui, rather than handling each insert individually with an API call from NiFi.
I was just seeking confirmation on whether there is functionality that allows setting auto-increment at the database level.
As far as I know, there is no functionality that allows setting auto-increment at the database level.
You can send the CSV to a Moqui service, either via the REST API or manually through a screen, and handle the insertions there. In that case, the PKs would be handled as usual within Moqui. Normally this would require implementing a service that handles the CSV parsing.
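A rough sketch of what such a service script could look like (Groovy; csvText would be an in-parameter of the service, the entity and column handling are placeholders, and a real implementation should use a proper CSV parser such as Apache Commons CSV to handle quoting and embedded commas):

```groovy
// Hedged sketch: parse a simple header+rows CSV and insert each row through
// the entity facade, so sequenced PKs, EECAs, etc. all apply as usual.
List<String> lines = csvText.readLines().findAll { it.trim() }
List<String> header = lines.remove(0).split(',')*.trim()

for (String line in lines) {
    List<String> values = line.split(',')*.trim()
    Map<String, Object> row = [:]
    header.eachWithIndex { String name, int i -> row[name] = values[i] }

    // set non-PK fields from the row and let Moqui generate the PK
    ec.entity.makeValue("org.example.Invoice")
            .setFields(row, true, null, false)
            .setSequencedIdPrimary()
            .create()
}
```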
Well, think of it simply as a message queue. If you're coming through NiFi, it might mean you're doing huge transactions or large volumes. So instead of choking the system and having issues with sequence generation, you might simply receive the NiFi data pipeline and push it to the SystemMessage entity; from there you can process it in a sequential manner until the queue is emptied, using the tips provided by @jenshp. The problem of sequence generation is already solved in Moqui, and the entity facade can generate whatever sequence you like. I think we might even have some higher-level services for that.
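Something like this rough sketch; the systemMessageTypeId is a placeholder for a SystemMessageType you would define yourself (with a consume service that does the actual parsing and inserting), and I'm assuming the stock receive#IncomingSystemMessage service:

```groovy
// Hedged sketch: queue an incoming CSV payload as a SystemMessage so it can
// be consumed sequentially later; "NifiCsvImport" is a hypothetical type.
Map received = ec.service.sync()
        .name("org.moqui.impl.SystemMessageServices.receive#IncomingSystemMessage")
        .parameters([systemMessageTypeId: "NifiCsvImport", messageText: csvText])
        .call()
// the payload is now stored in moqui.service.message.SystemMessage; your
// type's consume service processes the queue one message at a time
```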
To load CSV into Moqui easily, there are two main options I can think of that don't use the SystemMessage (I would only use the SystemMessage if you're having trouble loading a lot of data all the time):
If you only need to load the data once or at startup, you can put the data into a CSV data file for each entity; you can then set the dataType so the data loader picks the files up.
If you need to load data constantly, you can call the /qapps/tools/Entity/DataImport screen's load transition in CSV mode. If you need to modify it or run similar code from a service, you can use the underlying EntityDataLoader for parsing and importing the CSV files, as sketched below.
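For the second option, a rough sketch of using EntityDataLoader from Groovy (entity and field names are placeholders):

```groovy
// Hedged sketch: two ways to feed CSV into EntityDataLoader, with "ec" as
// the ExecutionContext; entity/field names are hypothetical.

// Option A: plain CSV (e.g. straight from NiFi), supplying the entity and
// field names in code instead of in the file
long loaded = ec.entity.makeDataLoader()
        .csvText("AAA1001,2024-01-15 00:00:00\nAAA1002,2024-01-16 00:00:00")
        .csvEntityName("org.example.Invoice")
        .csvFieldNames(["invoiceId", "invoiceDate"])
        .load()

// Option B: a Moqui-convention CSV file where line 1 is the entity name and
// line 2 is the field names, loaded by location
ec.entity.makeDataLoader()
        .location("component://mycomponent/data/InvoiceData.csv")
        .load()
```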
I'm working on a daily data sync between two DBs; the source frequently generates the CSVs. SystemMessage then polls for the data files and consumes these CSV files with the help of EntityDataLoader, as you said.
I'm working on ETL data flows to build a data warehouse to simplify our analytics and reporting capabilities.
We are delivering an analytics platform powered by Apache Superset along with our product, but it is hard to manage so many datasets in the form of SQL queries, and they put a lot of load on the transactional DB.
So we've decided to design an OLAP-style DB. This is where the ETL comes in, to sync data between two different database schemas (one normalised, the other following the fact-and-dimension model).
Can you provide more details about the Moqui data sync? Is it doing the same thing as polling the server for the files (if possible, SFTP would be nice; otherwise we will find a way) and loading them into the specific entity?