I can't find the part where the username length is fed into the key calculation logic. For example, in `local_38 != (uStack_84 & 0xffff ^ local_50)`, `local_50` is supposed to hold the username length, but there's no other initialization for it anywhere, so it's always 0.
Can somebody explain this to me?
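For what it's worth, here is a minimal C sketch of what I think the check reduces to if `local_50` really is never written. Only the names `local_38`, `uStack_84`, and `local_50` come from the decompilation; the wrapper function and the demo values are my own invention:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical wrapper around the quoted comparison; only the names
 * local_38, uStack_84 and local_50 come from the decompilation. */
static int key_mismatch(uint32_t local_38, uint32_t uStack_84)
{
    uint32_t local_50 = 0;  /* never assigned anywhere else -> always 0 */

    /* x ^ 0 == x, so the condition collapses to
     * local_38 != (uStack_84 & 0xffff): the username
     * length contributes nothing to the check at all. */
    return local_38 != ((uStack_84 & 0xffff) ^ local_50);
}

int main(void)
{
    /* arbitrary demo values: (0xdead1234 & 0xffff) ^ 0 == 0x1234 */
    printf("%d\n", key_mismatch(0x1234, 0xdead1234));  /* prints 0 (match) */
    return 0;
}
```

Note that C's precedence already parses `uStack_84 & 0xffff ^ local_50` as `(uStack_84 & 0xffff) ^ local_50`, so the parenthesized form above is the same expression.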
==>
13 reference integers, I mean
==>
I can't solve it. I looked at others' writeups as well, but I still don't get it. The workflow is:
user input key -> encoding (00402160-0040218f) -> reference-integer checksum (004023c0-004023d6) -> encoded-user-input-key checksum (004023e3-004023fa).
So the point is: how do you find a sequence of encoded user-input integers that both reproduces the reference checksum when it goes through the checksum logic and stays within ASCII range when it's reverse-encoded?
Meanwhile, the reference checksum is the result of the checksum calculation run over 32 integers.
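Here is how I'd frame that search in code: per position, the only valid candidates are encoded integers whose reverse-encoding lands in printable ASCII, and you want a combination of them that reproduces the reference checksum. This C sketch shows the framing only. `encode_byte` and `checksum` are made-up stand-ins for the real routines at 00402160 and 004023c0, I use 13 positions after the earlier comment (the 32 in the last sentence may refer to something else), and the demo plants a known answer so it terminates, since a blind DFS over 13 positions would never finish in practice:

```c
#include <stdint.h>
#include <stdio.h>

#define KEY_LEN 13  /* "13 reference integers", per the earlier comment */

/* Stand-in for the encoding at 00402160-0040218f: byte -> integer.
 * An arbitrary injective map; the real routine would go here. */
static uint32_t encode_byte(unsigned char c)
{
    return (uint32_t)c * 2654435761u;
}

/* Stand-in for the checksum at 004023c0-004023d6 over the integer array. */
static uint32_t checksum(const uint32_t *v, int n)
{
    uint32_t h = 0;
    for (int i = 0; i < n; i++)
        h = h * 31u + v[i];
    return h;
}

/* DFS: pick a printable ASCII byte per position (so the reverse encoding is
 * in ASCII range by construction), encode it, and accept when the full array
 * reproduces the reference checksum. Exponential without pruning -- a real
 * attack would exploit the checksum's structure instead of searching blindly. */
static int solve(uint32_t ref, uint32_t *enc, unsigned char *key, int pos)
{
    if (pos == KEY_LEN)
        return checksum(enc, KEY_LEN) == ref;
    for (unsigned char c = 0x20; c <= 0x7e; c++) {
        key[pos] = c;
        enc[pos] = encode_byte(c);
        if (solve(ref, enc, key, pos + 1))
            return 1;
    }
    return 0;
}

int main(void)
{
    unsigned char key[KEY_LEN + 1] = {0};
    uint32_t enc[KEY_LEN];
    uint32_t planted[KEY_LEN];

    /* Plant a known answer so the demo finishes instantly; in the crackme,
     * ref is the checksum computed from the reference integers in the binary. */
    for (int i = 0; i < KEY_LEN; i++)
        planted[i] = encode_byte(' ');
    uint32_t ref = checksum(planted, KEY_LEN);

    if (solve(ref, enc, key, 0))
        printf("candidate key: \"%s\"\n", (char *)key);
    else
        puts("no printable-ASCII key reproduces this checksum");
    return 0;
}
```

In practice you'd replace the DFS with something that inverts the checksum's structure directly: if it's a running sum or a CRC-style fold, each position can be solved for rather than searched.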
==>