Initial API-centric/JSON overhaul, 2FA login.
With Mincka#83, we need a new approach to allow this program to function.
This is the first attempt. Some features have been broken or removed,
and likely cannot be added back. Cards, embedded tweets, etc. have been
dropped, and stickers may be broken too (do they even exist anymore?).

I can't promise that this works robustly, but it was tested in Python
3.7 with a saved session. Authentication may also be broken: I tried to
implement a 2FA fix as well, but locked my account out with too many
login attempts before I could confirm it works.
cajuncooks committed Aug 31, 2020
1 parent 2bb0956 commit a66bc4f
Showing 3 changed files with 203 additions and 262 deletions.
10 changes: 5 additions & 5 deletions dmarchiver/cmdline.py
@@ -12,15 +12,14 @@
     Conversation ID
 -u, --username Username (e-mail or handle)
 -p, --password Password
+-t, --token 2FA token code (optional)
 -d, --delay Delay between requests (seconds)
 -s, --save-session Save the session locally
 -di, --download-images
     Download images
 -dg, --download-gifs Download GIFs (as MP4)
 -dv, --download-videos
     Download videos (as MP4)
--th, --twitter-handle
-    Use the Twitter handles instead of the display names
 -r, --raw-output Write the raw HTML to a file
 """

@@ -44,6 +43,7 @@ def main():
     parser.add_argument("-id", "--conversation_id", help="Conversation ID")
     parser.add_argument("-u", "--username", help="Username (e-mail or handle)")
     parser.add_argument("-p", "--password", help="Password")
+    parser.add_argument("-t", "--token", default=None, help="2FA token")
     parser.add_argument("-d", "--delay", type=float, default=0, help="Delay between requests (seconds)")
     parser.add_argument(
         "-s",
@@ -94,7 +94,7 @@ def main():

     crawler = Crawler()
     try:
-        crawler.authenticate(username, password, args.save_session, args.raw_output)
+        crawler.authenticate(username, password, args.save_session, args.raw_output, args.token)
     except PermissionError as err:
         print('Error: {0}'.format(err.args[0]))
         print('Exiting.')
@@ -113,15 +113,15 @@ def main():
                 conversation_id,
                 args.delay,
                 args.download_images,
-                args.download_gifs, args.download_videos, args.twitter_handle, args.raw_output)
+                args.download_gifs, args.download_videos, args.raw_output)
         else:
             print('Conversation ID not specified. Retrieving all the threads.')
             threads = crawler.get_threads(args.delay, args.raw_output)
             print('{0} thread(s) found.'.format(len(threads)))

             for thread_id in threads:
                 crawler.crawl(thread_id, args.delay, args.download_images,
-                              args.download_gifs, args.download_videos, args.twitter_handle, args.raw_output)
+                              args.download_gifs, args.download_videos, args.raw_output)
                 time.sleep(args.delay)
     except KeyboardInterrupt:
         print('Script execution interruption requested. Exiting.')
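The cmdline.py hunks only show the argparse side of the change; the matching `Crawler.authenticate` implementation lives in the other changed files of this commit and is not visible here. A minimal sketch of how the new `--token` flag might flow into authentication, assuming a simplified `Crawler` stub (the stub below is hypothetical, not dmarchiver's real class):

```python
import argparse


class Crawler:
    """Hypothetical stand-in for dmarchiver's Crawler; the real class is not shown in this diff."""

    def authenticate(self, username, password, save_session, raw_output, token=None):
        # The real implementation would log in to Twitter; with a non-None
        # token it would additionally submit the code at the 2FA challenge.
        return {'user': username, '2fa': token is not None}


parser = argparse.ArgumentParser()
parser.add_argument("-u", "--username", help="Username (e-mail or handle)")
parser.add_argument("-p", "--password", help="Password")
parser.add_argument("-t", "--token", default=None, help="2FA token")
args = parser.parse_args(["-u", "alice", "-p", "hunter2", "-t", "123456"])

session = Crawler().authenticate(args.username, args.password, False, False, args.token)
print(session)  # {'user': 'alice', '2fa': True}
```

Because `--token` defaults to `None`, existing invocations without 2FA keep working unchanged; only users whose accounts require a second factor need the new flag.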