Discussion:
Read and delete rows in a File
Antonio Astorga
2019-11-11 14:16:54 UTC
Hello
I'm creating an AssemblyLine to read a file and obtain the information that another AssemblyLine has stored there.
I'm using the ibmdi.FileSystem Connector in Iterator mode, but with it I can only read one line per execution or the whole file.
Is there a way to read row by row, obtain the information, and then delete each row once it has been processed?
The file I need to read contains information that helps us add roles to users or remove them, so every time I process an entry I need to delete that information from the file.
Is it possible to do this - read row by row and delete lines - using the FileSystem Connector?
Thank you for your help!!!
j***@gmail.com
2019-11-12 04:44:12 UTC
Hi Antonio,
in general a file does not allow you to delete information on a line-by-line basis. Typically you can only read the file, clear the file, or append to it. There may be exceptions, of course, but typically you need to rewrite the entire file if you want to delete a line at the beginning of it.

For your application you might consider using some other way of transmitting information from one AssemblyLine to the other, e.g. using a database table, where it is easy to add rows and delete rows, or maybe using a message queue, where one AssemblyLine sends a change and the other AssemblyLine reads the change.

If you want to go with the file, maybe the AssemblyLine that writes the information could always open the file in append mode, write one line, and then close the file. The AssemblyLine that reads the file could rename the file before it starts processing, and then it could process the entire renamed file, and delete it at the end of processing.
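
Since TDI scripts can call Java classes directly, the rename-and-delete part can be done with plain java.io.File. A minimal sketch, assuming a hypothetical path /data/requests.csv and that it runs in a Prolog script of the reading AssemblyLine:

// Rename the request file so the writing AssemblyLine can keep appending to a fresh one
var pending = new java.io.File("/data/requests.csv");        // hypothetical path
var inWork  = new java.io.File("/data/requests.inwork.csv"); // renamed copy to process

if (pending.exists()) {
    if (!pending.renameTo(inWork)) {
        task.logmsg("Could not rename the request file - skipping this cycle");
    }
}
// ... point the Iterator at /data/requests.inwork.csv and process it ...
// In the Epilog, once all rows are handled, remove the renamed file:
// inWork.delete();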
Antonio Astorga
2019-11-12 10:16:38 UTC
Thank you very much for your answer.
Keeping this in mind: we came up with the idea of using this file because in our first AssemblyLine we program time schedules during which the system does not accept changes, so we store all the requests that arrive in that period in a CSV file and then process all of those requests once the time schedule allows it.
Is there something in TDI that works like a queue, where we could implement something like what I just described?
Thanks again for your help.
Eddie Hartman
2019-11-12 19:21:21 UTC
Nice idea, Antonio. You are implementing a Service Queue using your CSV file. You could try using the System Store Connector. It lets you store whole Entry objects on a key value (think HashMap to disk). So one process accepts requests (HTTP) or reads CSVs (maybe after checking and FTPing over new ones) to populate the Request DB. Then your scheduled task performs tasks and updates the Request DB accordingly. Perhaps removing Requests once they are successfully handled, or fail with an error that requires manual intervention. The System Store initially uses the bundled Apache Derby RDBMS, and you can point it at any JDBC compliant system (DB2, Oracle, SQLServer, ...).
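
Roughly like this at the script level, just to show the shape of the idea - I'm writing the System Store calls from memory, so treat the method names as an assumption and check the UserFunctions javadoc (the Connector itself is configured in the AL rather than scripted):

// Producer side: park an incoming request in the System Store under its request id
var request = system.newEntry();
request.UserID = "47150790P";
request.RolID = "P_EPA_AUDITORES";
request.Operation = "addRoleToUser";
system.setPersistentObject("RQ150301", request);     // assumed method name

// Consumer side (the scheduled AL): pick it up later and remove it once handled
var stored = system.getPersistentObject("RQ150301"); // assumed method name
if (stored != null) {
    // ... perform the role change ...
    system.deletePersistentObject("RQ150301");       // assumed method name
}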

There is also the System Queue, although I have less experience with this. It uses the bundled Apache ActiveMQ by default, and like the System Store it can be replaced with a JMS compliant queueing system. As with many standard protocol-based Connectors, there is additionally an IBM version - e.g. MQ(JMS), SDS(LDAP) & Cloudant(MapReduce).

Hope this rambling helps!
/Eddie
Antonio Astorga
2019-11-13 16:59:27 UTC
Thank you Eddie for the information!!

I will check it out for the next implementation; we already have the CSV implemented and I have a short time to do this.

So, I'm reading the whole file with the information stored there: 4 columns (requestId, rolId, userId and Operation).
To read this information I have a simple script that reads the attributes from each row. The Connector is in Iterator mode and I'm reading the whole file.

var usuario= work.UserID;
var requestId=work.RequestID;
var Operation= work.Operation;
var rolId= work.RolID;

But my file contains several rows with different users and roles, like this:
RQ150301;P_EPA_AUDITORES;47150790P;ns:WebServiceServer_thisNamespace#addRoleToUser
RQ150302;P_EPA_AUDITORES;47150790P;ns:WebServiceServer_thisNamespace#removeRoleFromUser

So I thought of saving this information in an array, like this:

var Array = [requestId,rolId,usuario,Operation];
task.logmsg("Valor Operacion"+ Array);

But I'm getting something like this:


17:24:37,064 INFO - Valor OperacionRQ150301,P_EPA_AUDITORES,47150790P,ns:WebServiceServer_thisNamespace#addRoleToUser
17:24:37,064 INFO - Valor OperacionRQ150302,P_EPA_AUDITORES,47150790P,ns:WebServiceServer_thisNamespace#removeRoleFromUser

I need this because I need to erase the first row processed, in order to empty the "queue".

Any ideas?

Thanks again for your help
Eddie Hartman
2019-11-14 09:24:11 UTC
If you want to store these in memory, Antonio, then using an Array is a good option. However, I would store whole objects to make your coding easier. Let me borrow from your code above:

// Global variable for the service queue
var _SVCQ = [];

// Function to add a request
function addToServiceQueue(entry) {
    // Typically pass in the work Entry as 'entry'

    // Create a Javascript object to hold the values read from the CSV
    var request = {
        usuario: entry.getString("UserID"), // use getString() to return the value
        requestId: entry.getString("RequestID"),
        operation: entry.getString("Operation"),
        rolId: entry.getString("RolID")
    };
    _SVCQ.push(request);
}

This will give you a queue of requests, each an object with four properties: usuario, requestId, operation and rolId. Then you can have another function to return either the latest one pushed (LIFO) or the first one pushed (FIFO). LIFO is easiest, as you just .pop() from the array. This function implements FIFO:

function getNextRequest() {
    if (_SVCQ.length == 0) {
        return null; // queue is empty
    }

    // Retrieve the oldest item in the queue (FIFO). With LIFO you would just .pop()
    var request = _SVCQ[0];

    // Remove that item from the queue
    _SVCQ.splice(0, 1);
    return request;
}

When you retrieve a request, you simply reference its properties:

var request = getNextRequest();
if (request != null) {
    for (var propertyName in request) {
        task.logmsg(propertyName + ": " + request[propertyName]);
    }
}

And if you find yourself needing a library of functions, be sure to use Resources > Scripts in your SDI Project. You can then tell your AL which scripts to pre-load at startup. Let me know if you want to know more :)

Hope this helps!

/Eddie
Antonio Astorga
2019-11-15 12:43:38 UTC
Good morning Eddie,

Thank you very much for your help, I tried this code and of course it is working :)
So, when the system reads the whole file, everything is stored using the function addToServiceQueue.
The function getNextRequest retrieves the information stored earlier.
So I have the file with this information, for example:
RQ150301;P_EPA_AUDITORES;47150790P;ns:WebServiceServer_thisNamespace#addRoleToUser
RQ150302;P_EPA_AUDITORES;47150791P;ns:WebServiceServer_thisNamespace#removeRoleFromUser
RQ150303;P_EPA_AUDITORES;47150792P;ns:WebServiceServer_thisNamespace#removeRoleFromUser

The system reads the whole file and we store the information using the function addToServiceQueue, like this:

addToServiceQueue(work);


The idea we have in mind is to use this CSV file like a queue: once I have read and processed the first line of the file, erase that line and continue with the process until the file is empty, like this:

RQ150302;P_EPA_AUDITORES;47150791P;ns:WebServiceServer_thisNamespace#removeRoleFromUser
RQ150303;P_EPA_AUDITORES;47150792P;ns:WebServiceServer_thisNamespace#removeRoleFromUser

Now we have only two lines to process. So in order to achieve this, and correct me if I'm wrong, I need to store the whole content of the file, process the first line, erase the file, retrieve the remaining information using getNextRequest, and write it back using the File Connector without the row of the already-processed request, right?
Is it possible to delete the whole file using this Connector?
Any recommendation?
Thank you very much again for your help!! I'm learning a lot with this :)
Eddie Hartman
2019-11-19 09:06:04 UTC
If you want to constantly save state then yes, update the CSV as you go. If it is not too large this should be quick. You could add another function saveRequests() that takes the array and overwrites the CSV file. To ensure correct CSV writing - including escaping quotes, commas and such - have a File Connector configured in the Resources > Connectors folder of your project. Let's say you call it RequestFile. Make sure it has the correct filePath and the CSV Parser set up correctly. Then your function would look like this:

function saveRequests() {
    // getConnector() can return a base template - like ibmdi.FileSystem - or a library
    // Connector, as done below
    var reqFile = system.getConnector("RequestFile");
    var reqEntry = system.newEntry(); // Need an Entry object for the writes below
    // Initialize the Connector (this also initializes the Parser)
    reqFile.initialize(null);
    // A for-loop on a Java array returns the member items; for a Javascript array it returns the index
    for (var i in _SVCQ) {
        var req = _SVCQ[i];
        // Set up the Entry to write this request to the CSV
        reqEntry.UserID = req.usuario;
        reqEntry.RequestID = req.requestId;
        reqEntry.Operation = req.operation;
        reqEntry.RolID = req.rolId;
        // Write to the CSV file
        reqFile.putEntry(reqEntry);
    }
    reqFile.terminate(); // Close the Connector
}

And of course you've already popped off the Request you handled, so the file will be shorter each time.
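
Putting the pieces together, the processing step could look roughly like this (just a sketch; the role-change call itself is whatever your AL already does for add/remove):

// Handle one queued request, then persist the shortened queue back to the CSV
var request = getNextRequest();   // oldest request, already removed from _SVCQ
if (request != null) {
    // ... perform the add/remove role operation for request.usuario / request.rolId ...
    saveRequests();               // rewrite the CSV file without the handled row
}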

Hope this helps!

/Eddie
Antonio Astorga
2019-11-21 09:15:34 UTC
Hello Eddie!!
I'm applying this code to save the requests, but I don't know why I'm getting some errors.
First of all, as we know, the function addToServiceQueue is used to store the information that comes from the file. Once I have processed the first row, adding or removing a role for an identity, I call the function saveRequests();
If I understood the code and the explanation you gave me, this function should remove the processed line, so from 6 rows I will have only 5, and this is repeated until there are no more rows, right?
But when I run the project, the file that contains 6 rows at the beginning of the test ends up with only one row, and that row also gets an extra ";", which causes problems in the next iteration, because the position after the new ";" is where we store the RolID and now it is empty.
I'm trying to solve it, but I don't see where we are adding this ";" into the file.
Could you help me with this again? :S

And one more thing: this AssemblyLine has no Feed, and the file will keep receiving information from another AssemblyLine, so how do we keep reading this file continuously?
I'm using a FOR-EACH Connector Loop to read the file at the beginning of the Data Flow. Do you think this is the correct way to do that?
Thank you again!!
Eddie Hartman
2019-11-22 12:42:33 UTC
In general, if you get something to work then great. The Feed section is just a built-in loop. If you don't have Feed Connectors then the Data Flow is run once, but you can have your own set of Loops there. You could use a Conditional Loop as the outer cycler with a scripted Condition of: return true;

Inside it you can have your FOR-EACH Connector Loop to read the file. I often build ALs with no Feed, doing all the looping myself.
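
As a rough sketch (the file check and the sleep via java.lang.Thread are just one way to keep an endless loop from spinning and flooding the log when there is nothing to do; the path is hypothetical):

// Scripted Condition of the outer Conditional Loop - keep cycling forever
return true;

// First Script component inside the loop: if the request file is missing or
// empty, pause before looking again
var queueFile = new java.io.File("/data/requests.csv");
if (!queueFile.exists() || queueFile.length() == 0) {
    java.lang.Thread.sleep(5000); // wait 5 seconds before the next check
}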

/Eddie
Antonio Astorga
2019-11-26 15:37:18 UTC
Thank you very much for your help Eddie!!

All the code is implemented with a few changes, and everything is working. I was making a few mistakes - the Hooks turned out to be important.

So, the FOR-EACH Connector Loop should be placed in the Data Flow at the beginning of everything, right?

Thank you very much!!
Antonio
Eddie Hartman
2019-11-26 17:59:58 UTC
If I understand what you are trying to do, Antonio, then yes. But there is no prescription for where you put your Loop. You might want some pre-Loop logic, for example in Script components earlier in the AL. And you can have multiple Loops. It's all up to you and what you are trying to solve :)

/Eddie
Antonio Astorga
2019-11-28 11:14:17 UTC
Hello Eddie

I have implemented a Conditional Loop at the beginning of everything, with the scripted Condition you mentioned before.
But as it loops, when there are no queued requests to process the system keeps printing log messages endlessly.
How can I avoid this? Is there a way to keep it listening without printing logs?
Regards
Antonio Astorga
2019-11-29 08:19:28 UTC
Hello Eddie
Problem solved!!
Thank you for your help
Eddie Hartman
2019-11-29 10:44:21 UTC
Nice work, Antonio! And like I often tell people, hit the Debugger button and step through to a problem. It often gives you new insights :)

/Eddie
