Suggestion to compartmentalize data #226
Replies: 7 comments 3 replies
-
Hi @MaheshkumarSundaram
-
No, it can be used anywhere as well.

> On Sep 11, 2024, at 1:10 PM, Maheshkumar Sundaram wrote:
> Sorry @ralphhanna. `$item.data.template..` can be used anywhere no matter the type of BPMN element. But if I create a method `getTemplate` in appServices, I can only call this in a Service Task, right?
> Not sure whether I am understanding your point.
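To make the distinction concrete, a minimal sketch (the class shape, method, and helper below are assumptions about a typical setup, not code from this thread):

```js
// Placeholder for whatever actually retrieves the document (MySQL, Mongo, HTTP, ...).
async function fetchTemplateSomehow(instanceId, docVarName) {
  return `template ${docVarName} for instance ${instanceId}`;
}

// Hypothetical appServices-style class: its async methods are called from a Service Task
// implementation, whereas an expression such as $item.data.template.header can appear in
// any element's input/output, because it only reads data already stored on the instance.
class AppServices {
  async getTemplate(item, docVarName) {
    return fetchTemplateSomehow(item.token.execution.instance.id, docVarName);
  }
}
```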
-
Hi @ralphhanna, in relation to the above use case, I ran into another issue. In a Service Task named `VarsCheck`:

```js
async VarsCheck(input, context) {
  // Log the task name and every resolved input value.
  console.log('Task:', context.item.name);
  for (const [key, value] of Object.entries(input)) {
    console.log(key, ': ', value);
  }
}
```

and the `GetTemplate` method:

```js
async GetTemplate(item, docVarName) {
  const instId = item.token.execution.instance.id;
  try {
    // Guard against unexpected document names before querying.
    if (!isAlphanumericString(docVarName)) {
      throw new Error("GetTemplate's document name can only contain numbers, letters and underscores.");
    }
    const template = await fetchWfDataFragmentTemplate(instId, docVarName);
    console.log('template:', template);
    return template;
  } catch (error) {
    // Surface the failure to the engine as a BPMN error.
    return { bpmnError: error.message };
  }
}
```
-
Hi @MaheshkumarSundaram, but since you are using it in a serviceTask, why not include the fetch logic in the serviceTask itself instead of in the input:
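A sketch of that idea (`fetchWfDataFragmentTemplate` is the helper from the earlier comment; the task name and the hard-coded document name here are made up):

```js
// Service Task implementation that does the fetch itself, rather than relying on an
// input expression to supply the template.
async LoadTemplate(input, context) {
  const item = context.item;
  const instId = item.token.execution.instance.id;
  // 'sampleDoc' is a hypothetical document name; in practice it could come from `input`.
  const template = await fetchWfDataFragmentTemplate(instId, 'sampleDoc');
  // Keep it on the instance data so later expressions like $item.data.template can read it.
  item.data.template = template;
  return template;
}
```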
-
For Service Tasks, yes, definitely, I could integrate the fetch logic into the implementation. The issue would be with other types of tasks, say a User Task's or a Receive Task's input/output variables. Is there any workaround for them?
-
I am working on a new release to allow all scripts to execute async functions, including input/output variables.
-
But for now, you can have a start trigger to load the data into `item.data.templates`, and the input values will just access them.
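A minimal sketch of that workaround, assuming the loading runs somewhere async-friendly (for example a Service Task right after the start event, or the start trigger itself); `loadTemplatesFromMySQL` is a made-up helper:

```js
// Hypothetical loader: runs once near the start of the instance and stores everything on
// item.data.templates, so later input/output expressions can read it synchronously.
async LoadTemplates(input, context) {
  const item = context.item;
  // loadTemplatesFromMySQL is a placeholder for your own MySQL query.
  item.data.templates = await loadTemplatesFromMySQL(item.token.execution.instance.id);
}
```

A later task's input can then be a plain expression such as `$item.data.templates.someFile` (where `someFile` is whatever key the loader used), with no async call involved.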
-
Hi @ralphhanna,
I ran into a critical issue with the use case below.
One of my Service Tasks loads data files stored in MySQL (an external db; I still use Mongo for `bpmn-web`) into the workflow instance. This is done to keep the data files unchanged throughout the span of the workflow instance. Basically, the output of the service would be to inject the below sample structure into the instance data:
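A hypothetical shape consistent with the description (the keys below are made up):

```js
// Hypothetical shape only — keys and nesting are illustrative, not the exact structure.
const templates = {
  invoiceLayout: { fileType: 'html', fileContent: '<html>...</html>' }, // each 2–5 MB in practice
  appSettings:   { fileType: 'ini',  fileContent: '[section]\nkey = value' },
  releaseNotes:  { fileType: 'md',   fileContent: '# Release notes ...' }
};
// The Service Task would then set item.data.templates = templates;
```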
In the above, `fileContent` can be of type `json`, `ini`, `text`, `md` or `html`. The size of these file contents may range from 2 MB to 5 MB. I plan to use it like `$item.data.templates.<fileName>` in subsequent tasks. But this causes the below error in the web server, which makes sense due to the sheer amount of data in addition to all the other instance records for the Mongo query:

I can't directly query MySQL every time as the data are subject to change, and I would like to keep those data files the same throughout the lifespan of a workflow instance.
I am still exploring options but I am listing these two here:
Option 1:
I am thinking of creating a separate collection called `wf_data_templates` with an index `instId`, which is the `id` of `wf_instances`. But then accessing `$item.data.templates.<fileName>` in the workflow would fail, as the templates would no longer be part of the instance data. But this complicates everything, I suppose.
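A sketch of the Option 1 lookup (the collection and field names follow the description above; the MongoDB wiring itself is an assumption):

```js
const { MongoClient } = require('mongodb');

// Hypothetical helper: fetch a single template document for a given instance from the
// separate wf_data_templates collection instead of embedding it inside wf_instances.
async function fetchTemplateForInstance(mongoUrl, instId, fileName) {
  const client = await MongoClient.connect(mongoUrl);
  try {
    return await client
      .db()                               // database taken from the connection string
      .collection('wf_data_templates')
      .findOne({ instId, fileName });     // instId is the wf_instances id, indexed
  } finally {
    await client.close();
  }
}
```

Since `$item.data.templates.<fileName>` can't reach into that collection on its own, the tasks that need a template would have to call a helper like this (or an appServices method wrapping it) explicitly.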
Option 2:
Write the file contents to the web server's disk, categorized into directories (directory name = instance id), and keep just the path in the Mongo instance data. But then, if there are 100 workflows ongoing and many such data-fetching Service Tasks in each workflow, that may lead to file write or access errors as well. Cleaning up these files when a workflow completes would be a hassle too.
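A sketch of the write side of Option 2, assuming one directory per instance id (the base directory and file naming are assumptions):

```js
const fs = require('fs/promises');
const path = require('path');

// Hypothetical helper: persist a fetched file under <baseDir>/<instanceId>/ and return the
// path, so only the path string (not the 2–5 MB content) is stored in the Mongo instance data.
async function storeTemplateFile(baseDir, instanceId, fileName, fileContent) {
  const dir = path.join(baseDir, instanceId);
  await fs.mkdir(dir, { recursive: true });   // no-op if the directory already exists
  const filePath = path.join(dir, fileName);
  await fs.writeFile(filePath, fileContent);
  return filePath;
}
```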
I did think of Redis but it can't be used reliably as my OS is Windows.
Option 2 with the file system could work if there is a reliable way to delete the folder when a workflow instance completes. Is there any way to call a Service Task automatically under the hood when an instance completes?
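If your setup does expose a hook at instance completion (an assumption, not something confirmed in this thread), the cleanup itself is small:

```js
const fs = require('fs/promises');
const path = require('path');

// Hypothetical cleanup: remove the per-instance directory once the workflow instance ends.
// How it gets triggered (a listener, an end-event script, or a periodic sweep over completed
// instances) depends on what bpmn-web exposes in your version.
async function cleanupInstanceFiles(baseDir, instanceId) {
  await fs.rm(path.join(baseDir, instanceId), { recursive: true, force: true });
}
```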
I am struggling to find an effective solution to resolve this!
I just thought to ask whether you could suggest any feasible solution for my use case.
Any help would be much appreciated. Thank you so much.