Problem:
Extracting structured data from semi-structured logs by hand is slow and error-prone.
Now you can do it in seconds.
Solution:
Filter logs by text, severity, importance, … (a GLF record has 36 fields).
Don’t know what to filter on? Just check the available values.
Still not sure? Check the cheat sheet, or just start broad: filter, for example, for lines that contain numbers, then narrow down.
Or filter by error severity and check error frequencies, so you can start working on the most frequent one (a scripted sketch of this step follows below).
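For readers who want to reproduce this triage step outside the tool, here is a minimal Python sketch. The '|' delimiter, the severity code "E", and the field positions are assumptions for illustration, not the actual GLF layout:

    import re
    from collections import Counter

    # Count error-severity lines, grouped by message "shape".
    # Assumption: '|'-delimited records, severity in field 2, message last.
    def error_frequencies(path, severity_field=2):
        counts = Counter()
        with open(path, encoding="utf-8", errors="replace") as f:
            for line in f:
                fields = line.rstrip("\n").split("|")
                if len(fields) > severity_field and fields[severity_field] == "E":
                    # Collapse digits so the same error with varying IDs groups together.
                    key = re.sub(r"\d+", "#", fields[-1].strip())[:120]
                    counts[key] += 1
        return counts.most_common(10)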
Extract useful data (durations, row counts, user names, report names, …) from the filtered lines into one or several columns, just by using regular expressions.
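The extraction step boils down to named capture groups. A small sketch; the message format behind the pattern is invented, so adapt it to the lines in your own log set:

    import re

    # Hypothetical message format: "... rows=500 duration=0.27s ..."
    PATTERN = re.compile(r"rows=(?P<rows>\d+).*?duration=(?P<duration>[\d.]+)s")

    def extract(lines):
        out = []
        for line in lines:
            m = PATTERN.search(line)
            if m:
                # Two extracted columns: row count and duration in seconds.
                out.append((int(m["rows"]), float(m["duration"])))
        return out

    print(extract(["fetch done rows=500 duration=0.27s"]))  # [(500, 0.27)]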
Analyze the results in place: sort, count, summarize, jump back to the tree structure for a detailed or a short stack trace, visualize the duration distribution across hosts/processes/threads/…, or export to Excel.
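A rough pandas equivalent of this in-place analysis, assuming the extracted columns already sit in a table; the sample rows and the host/duration column names are invented:

    import pandas as pd

    df = pd.DataFrame(
        [("hostA", 0.19), ("hostA", 0.27), ("hostB", 4.8)],
        columns=["host", "duration"],
    )

    # Count and summarize the duration distribution per host, worst first,
    # then export to Excel (the writer needs the openpyxl package).
    summary = df.groupby("host")["duration"].agg(["count", "sum", "mean", "max"])
    print(summary.sort_values("sum", ascending=False))
    df.to_excel("durations.xlsx", index=False)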
Example:
WEBI refresh.
An end-to-end (E2E) GLF log set of about 38k lines, if you know how to filter them out of potentially millions; BiWhy will do it for you.
E2E duration: 306 seconds.
JAVA/HANA execute: 409 calls, total time: 77.8 sec.
JAVA/HANA fetch: 409 calls, total time: 110.3 sec.
We can already see where the problem is: execute and fetch together account for 188.1 of the 306 seconds, over 60% of the end-to-end time. No need to be an expert at reading GLF logs to find it.
Could you find it manually in a reasonable time? Even knowing where to look, when the longest single call is just 5 seconds?
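The arithmetic behind that conclusion, checked in a few lines (the numbers are taken straight from the figures above):

    # The two JAVA/HANA phases dominate the refresh, yet the average call
    # is far too short to spot by scrolling through 38k lines.
    execute_s, fetch_s, e2e_s, calls = 77.8, 110.3, 306.0, 409
    hana = execute_s + fetch_s
    print(f"execute + fetch = {hana:.1f} s = {hana / e2e_s:.0%} of E2E")  # 188.1 s, 61%
    print(f"average fetch call = {fetch_s / calls * 1000:.0f} ms")        # ~270 ms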
Just the cherry on top:
Fetched rows: 1,225 in 245 round trips, i.e. only 5 rows per trip.