Discussion:
Copy data from Directory server to LDIF
g***@gmail.com
2018-06-30 15:11:08 UTC
Hi, I'm currently working on copying some TDS V5.2 data to an LDIF file. The directory holds about 8M entries, but I'm facing unexpected server shutdowns at any point in the process.

Is there any recommendation for this task?

Is it better to use paged search or virtual paged search for this task?

In the previous environment we did this task with 1.2M entries and didn't face these unexpected shutdowns.
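For reference, the control flow of a simple paged search (the first option being considered) can be sketched in pure Python. This is only an illustration of the RFC 2696 cookie loop; `fetch_page` and the fake directory below are stand-ins, not a real LDAP client:

```python
def paged_search(fetch_page, page_size=500):
    """Yield entries one page at a time using an RFC 2696-style cookie loop.

    fetch_page(page_size, cookie) stands in for the real LDAP call (a search
    with the simple paged results control); it must return
    (entries, next_cookie), with next_cookie == b"" on the last page.
    """
    cookie = b""
    while True:
        entries, cookie = fetch_page(page_size, cookie)
        for entry in entries:
            yield entry
        if not cookie:          # an empty cookie means the server is done
            break

# A fake directory of 8 entries, paged 3 at a time, to show the control flow.
DATA = [f"cn=user{i},o=example" for i in range(8)]

def fake_fetch(page_size, cookie):
    start = int(cookie or b"0")
    page = DATA[start:start + page_size]
    nxt = start + page_size
    return page, (str(nxt).encode() if nxt < len(DATA) else b"")

results = list(paged_search(fake_fetch, page_size=3))
```

The point of paging is that only one page is ever held in memory at a time, which matters with 8M entries.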
Franzw
2018-07-02 06:27:55 UTC
Post by g***@gmail.com
Hi, I'm currently working on copying some TDS V5.2 data to an LDIF file. The directory holds about 8M entries, but I'm facing unexpected server shutdowns at any point in the process.
Is there any recommendation for this task?
Is it better to use paged search or virtual paged search for this task?
In the previous environment we did this task with 1.2M entries and didn't face these unexpected shutdowns.
Why not use the db2ldif utility from TDS and then work with that file afterwards in TDI, if you need to manipulate the data?

TDS 5.2 is so old that you will have huge problems getting anyone to look into this, as it has been EOS for almost a decade now, IIRC...

HTH
Regards
Franz Wolfhagen
yn2000
2018-07-02 13:56:09 UTC
Agree. If the db2ldif utility solves the issue, then please ignore this post.
I just wanted to comment on the symptoms of "...unexpected server shutdowns..." and "...with 1.2M entries and didn't face these unexpected shutdowns...". I believe the main issue is performance and capacity: in this case, TDI received more data at once than it could chew. The usual TDI tuning from the various IBM publications should help. Based on your specific circumstances, here is what you can focus on:
- Tune the JVM. This is a must.
- Configure the page size. This is also a must.
- Because you only need 'some' of the data, restrict the returned attributes to free up memory.
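The last point, restricting returned attributes, amounts to asking the server for a short attribute list instead of everything. The effect can be illustrated with a minimal stand-alone sketch; the entry and the `WANTED` whitelist below are made up for the example:

```python
# Keep only the attributes the sync actually needs; everything else is
# dropped before the entry is held in memory or written to LDIF.
WANTED = {"cn", "sn", "mail"}          # hypothetical attribute whitelist

def restrict_attributes(entry, wanted=WANTED):
    """Return a copy of an entry dict containing only whitelisted attributes.

    The DN is kept unconditionally; attribute names are matched
    case-insensitively, as LDAP attribute types are.
    """
    lowered = {w.lower() for w in wanted}
    out = {"dn": entry["dn"]}
    for name, values in entry.items():
        if name != "dn" and name.lower() in lowered:
            out[name] = values
    return out

entry = {
    "dn": "cn=user1,o=example",
    "cn": ["user1"],
    "mail": ["user1@example.com"],
    "jpegPhoto": [b"...large binary..."],   # this is the kind of thing that eats memory
}
slim = restrict_attributes(entry)
```

With a real connector the same idea is expressed by configuring the attribute list on the search itself, so the large values never cross the wire at all.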
Rgds. YN.
g***@gmail.com
2018-07-03 16:42:34 UTC
Post by yn2000
Agree. If the db2ldif utility solves the issue, then please ignore this post.
- Tune the JVM. This is a must.
- Configure the page size. This is also a must.
- Because you only need 'some' of the data, restrict the returned attributes to free up memory.
Rgds. YN.
Hi yn2000, it seems that the db2ldif utility is corrupted, or generates the file with some null characters.

I tuned the JVM parameters and also configured a page size of 1000 entries, but nothing worked.

After a while I managed to read another directory in the current cluster, and the process ended without problems. My guess? Some network issue, or something misconfigured on that LDAP node, because after some tests I get an "unwilling to perform" error.
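A quick way to confirm the null-character suspicion is to scan the generated LDIF for NUL bytes and undecodable lines before trying to load it. A minimal sketch; the sample bytes are fabricated for the demonstration:

```python
def scan_ldif(data: bytes):
    """Report lines of an LDIF byte stream that contain NUL bytes or are
    not valid UTF-8 -- the two symptoms described above.

    Returns a list of (line_number, reason) tuples, 1-based.
    """
    problems = []
    for lineno, line in enumerate(data.splitlines(), start=1):
        if b"\x00" in line:
            problems.append((lineno, "NUL byte"))
            continue
        try:
            line.decode("utf-8")
        except UnicodeDecodeError:
            problems.append((lineno, "not valid UTF-8"))
    return problems

# A fabricated snippet: one clean line, one with an embedded NUL,
# one with a Latin-1 byte that is illegal in UTF-8.
sample = b"dn: cn=ok,o=example\ncn: bad\x00value\nsn: Mu\xf1oz\n"
issues = scan_ldif(sample)
```

Running such a check on the db2ldif output would show whether the file is actually damaged or whether the importing side is simply rejecting valid data.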
g***@gmail.com
2018-07-03 16:39:14 UTC
Post by Franzw
Why not use the db2ldif utility from TDS and then work with that file afterwards in TDI, if you need to manipulate the data?
TDS 5.2 is so old that you will have huge problems getting anyone to look into this, as it has been EOS for almost a decade now, IIRC...
HTH
Regards
Franz Wolfhagen
Hi Franzw, it seems like the db2ldif utility is corrupted and generates the LDIF file with some incorrect data, or with some strange characters that are not supported by the new directory.
Franzw
2018-07-04 07:28:16 UTC
Post by g***@gmail.com
Hi Franzw, it seems like the db2ldif utility is corrupted and generates the LDIF file with some incorrect data, or with some strange characters that are not supported by the new directory.
Have you checked the code pages of the servers?

The database codepage is defined in your etc/ibmslapd.conf file as ibm-slapdSetenv.

If you have a mismatch there, it may explain the "strange" characters.
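The kind of mismatch being described can be demonstrated in a few lines of Python. Codepage 1208 is UTF-8; the ISO-8859-1 value below is a made-up example of what an export under the wrong codepage produces:

```python
# What a codepage mismatch looks like at the byte level: the same bytes
# read back differently depending on the assumed encoding.
raw = "Muñoz".encode("iso-8859-1")     # b'Mu\xf1oz' -- hypothetical exported value

def decodes_as_utf8(data: bytes) -> bool:
    try:
        data.decode("utf-8")
        return True
    except UnicodeDecodeError:
        return False

ok_utf8 = decodes_as_utf8(raw)          # the mismatched bytes fail UTF-8 decoding
as_latin1 = raw.decode("iso-8859-1")    # decoding with the right codepage recovers the value
as_utf8_ok = decodes_as_utf8("Muñoz".encode("utf-8"))  # fine when codepages agree
```

When both sides really use 1208 (UTF-8), non-ASCII values survive the round trip; when one side assumed something else, exactly the "strange characters" described above appear.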

HTH
Regards
Franz Wolfhagen
g***@gmail.com
2018-07-05 15:07:09 UTC
Post by Franzw
Have you checked the code pages of the servers?
The database codepage is defined in your etc/ibmslapd.conf file as ibm-slapdSetenv.
If you have a mismatch there, it may explain the "strange" characters.
HTH
Regards
Franz Wolfhagen
Hi Franz,

I checked this, and both servers (V6.4 and V5.2) have ibm-slapdSetenv set to 1208.
yn2000
2018-07-05 18:00:24 UTC
Oooo... it seems that you are migrating data from TDS v5.2 to SDS v6.4. Why didn't you say so? There could be pitfalls, and the members of this group could help you better. Also, I have never heard of (or encountered) a corrupted db2ldif command, because the output LDIF file is created regardless of the outcome. So maybe you ran the db2ldif command without the -j option, in which case all operational attributes are part of the output LDIF file, and you are looking at the operational attribute data as 'strange' characters.
Also, you did mention that you only need 'some data', which led you to use TDI, right? And you solved your issue by pointing TDI to the other LDAP node, which I presume is a replicated LDAP node? The solution is far-fetched, but if your problem is solved, then so be it.
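To test that theory, the operational attributes can be filtered out of an exported record before comparing. A minimal sketch; the attribute list and the sample record (including the UUID value) are illustrative, and folded LDIF continuation lines are not handled:

```python
# Operational attributes of the kind db2ldif includes when -j is not given;
# this set is illustrative, not exhaustive.
OPERATIONAL = {"ibm-entryuuid", "createtimestamp", "modifytimestamp",
               "creatorsname", "modifiersname"}

def strip_operational(record: str) -> str:
    """Drop operational attribute lines from one LDIF record (no line folding)."""
    kept = []
    for line in record.splitlines():
        attr = line.split(":", 1)[0].strip().lower()
        if attr not in OPERATIONAL:
            kept.append(line)
    return "\n".join(kept)

record = """dn: cn=user1,o=example
cn: user1
ibm-entryuuid: 6f1f8c2a-0000-0000-0000-000000000000
createtimestamp: 20180101000000Z
sn: One"""
clean = strip_operational(record)
```

If the "strange" data disappears after stripping these attributes, the export file was never corrupted in the first place.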

Yes, I heard Franz rumbling behind my neck now. :-)
Rgds. YN.
g***@gmail.com
2018-07-05 19:07:15 UTC
Post by yn2000
Oooo... it seems that you are migrating data from TDS v5.2 to SDS v6.4. Why didn't you say so? There could be pitfalls, and the members of this group could help you better. Also, I have never heard of (or encountered) a corrupted db2ldif command, because the output LDIF file is created regardless of the outcome. So maybe you ran the db2ldif command without the -j option, in which case all operational attributes are part of the output LDIF file, and you are looking at the operational attribute data as 'strange' characters.
Also, you did mention that you only need 'some data', which led you to use TDI, right? And you solved your issue by pointing TDI to the other LDAP node, which I presume is a replicated LDAP node? The solution is far-fetched, but if your problem is solved, then so be it.
Yes, I heard Franz rumbling behind my neck now. :-)
Rgds. YN.
Haha, well, I tried generating the db2ldif output without the operational attributes, but I got the same result when trying to load the LDIF. That's why we used SDI to generate and create the LDIF.
yn2000
2018-07-05 20:43:30 UTC
So, the last post indicates a new fact: the issue is in the ldif2db command, not in db2ldif anymore, right? "...when i was trying to load the ldif...": you are using the ldif2db command, right? If yes, then you may need to check the first entry of that LDIF output file and/or the cryptography settings between the two servers. If you are not using the ldif2db command, then it is a different ball game, due to the existence of internal attributes such as ibm-entryUUID.
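Checking the first entry, as suggested, only takes a few lines. A rough sketch, assuming records are separated by blank lines as LDIF requires (folded continuation lines are not unfolded, and the sample file is made up):

```python
def first_record(ldif_text: str) -> str:
    """Return the first record of an LDIF string.

    Records are separated by blank lines; a leading version line and
    comment lines are skipped.
    """
    lines = []
    for line in ldif_text.splitlines():
        if not line.strip():
            if lines:
                break          # blank line after content ends the first record
            continue           # leading blank lines before any content
        if line.startswith("#") or line.startswith("version:"):
            continue
        lines.append(line)
    return "\n".join(lines)

ldif = """version: 1

dn: o=example
objectclass: organization
o: example

dn: cn=user1,o=example
cn: user1"""
head = first_record(ldif)
```

Inspecting this first record by eye (or with the NUL/UTF-8 scan) usually shows immediately whether the importer's complaint is about the data or about the server configuration.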
Rgds. YN.
g***@gmail.com
2018-07-05 22:25:32 UTC
Post by yn2000
So, the last post indicates a new fact: the issue is in the ldif2db command, not in db2ldif anymore, right? "...when i was trying to load the ldif...": you are using the ldif2db command, right? If yes, then you may need to check the first entry of that LDIF output file and/or the cryptography settings between the two servers. If you are not using the ldif2db command, then it is a different ball game, due to the existence of internal attributes such as ibm-entryUUID.
Rgds. YN.
Well, I don't think so... here's what I do now:

* Read the LDAP using SDI and create the LDIF.
* Load the LDIF using bulkload; this works.

Before this, I generated the LDIF and used bulkload, but it failed.
Then I tried to load the entries using SDI, but that failed too.
g***@gmail.com
2018-07-05 22:37:56 UTC
Post by yn2000
So, the last post indicates a new fact: the issue is in the ldif2db command, not in db2ldif anymore, right? "...when i was trying to load the ldif...": you are using the ldif2db command, right? If yes, then you may need to check the first entry of that LDIF output file and/or the cryptography settings between the two servers. If you are not using the ldif2db command, then it is a different ball game, due to the existence of internal attributes such as ibm-entryUUID.
Rgds. YN.
Well, I don't think so... here's what I do now:

* Read the LDAP using SDI and create the LDIF.
* Load the LDIF using bulkload; this works.

Before adopting this solution I was doing this:

* Generate the LDIF (db2ldif) and load it with bulkload; this failed.
* Then try to load the entries using SDI (reading the db2ldif LDIF); this failed too.
Franzw
2018-07-05 20:34:53 UTC
Post by yn2000
Oooo... it seems that you are migrating data from TDS v5.2 to SDS v6.4. Why didn't you say so? There could be pitfalls, and the members of this group could help you better. Also, I have never heard of (or encountered) a corrupted db2ldif command, because the output LDIF file is created regardless of the outcome. So maybe you ran the db2ldif command without the -j option, in which case all operational attributes are part of the output LDIF file, and you are looking at the operational attribute data as 'strange' characters.
Also, you did mention that you only need 'some data', which led you to use TDI, right? And you solved your issue by pointing TDI to the other LDAP node, which I presume is a replicated LDAP node? The solution is far-fetched, but if your problem is solved, then so be it.
Yes, I heard Franz rumbling behind my neck now. :-)
Rgds. YN.
Nah - just training my psychic abilities... :-)

Why do I have to guess what these characters are? Where are examples of the commands and the data?

Regards
Franz Wolfhagen