UserActivityStream -- used to create a full activity stream including subscriptions, favorites, notices, etc. -- normally buffers everything into memory at once. This is infeasible for accounts with long histories of heavy usage; it can take tens of seconds just to pull all the records from the database, and working with them all in memory is very likely to hit resource limits.
This commit adds an alternate mode for the class that defers pulling notices until the actual output pass. Instead of pre-sorting and buffering all the notices, the gaps between the other activities are filled in with notices as output is generated. This means more, smaller queries spread out over the operation, and less data held in memory.
Callers that can stream their output (the backupaccount action and backupuser.php) pass an $outputMode param of UserActivityStream::OUTPUT_RAW; during getString() it sends straight to output as well as slurping in the notices in this lazier fashion.
Other callers will let it default to the OUTPUT_STRING mode, which keeps the previous behavior.
There should be a better way to do this, swapping out the stringer output for raw output more consistently.
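A rough sketch of the two calling styles (the constructor's exact signature, including the middle argument, is an assumption here; check the class for the real interface):

    // Streaming caller, e.g. the backupaccount action or backupuser.php.
    // Assumed signature: __construct($user, $indent, $outputMode).
    $actstr = new UserActivityStream($user, true, UserActivityStream::OUTPUT_RAW);
    $actstr->getString(); // writes straight to output, pulling notices in batches as it goes

    // Other callers keep the old buffered behavior (OUTPUT_STRING default):
    $actstr = new UserActivityStream($user);
    $xml = $actstr->getString(); // returns the whole activity stream as one string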
Moved most of the heavy lifting for account restoration out of restoreuser.php and into its own class, with the hope that we'll do the work from the Web eventually.
common_shorten_links() could only access the web session's logged-in user, so it never properly took user options into account when posting via XMPP, the API, mail, etc.
This adds an optional $user parameter to common_shorten_links(), and $user->shortenLinks() as a clearer interface for the same thing.
Tweaked some lower-level functions so $user gets passed down, generalizing the $notice_id param that was previously there for saving URLs at notice save time.
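A sketch of the new interface from a non-web context ($status_text and the surrounding daemon code are placeholders):

    // Shorten links using this particular user's settings, e.g. when
    // posting via XMPP, the API, or mail, where there is no web session.
    $shortened = $user->shortenLinks($status_text);

The same $user can be passed down through common_shorten_links() and the lower-level helpers where the wrapper isn't convenient.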
Note also ticket #2919: there's a lot of duplicate code that calls the shortening, checks the length, and reports near-identical error messages. These should be consolidated to aid code and translation maintenance.
* add some sanity checking: abort on failures instead of plodding through
* add some progress / error output
* fetch the target database server name from the status_network entry and use that to target the DROP DATABASE (see the sketch below)
Note that database names and other overrides in the status_network entry may still not be picked up.
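A minimal sketch of the targeting and the abort-on-failure checks, assuming the status_network row carries dbhost and dbname fields; the field names, credentials, and direct mysqli calls here are illustrative rather than the script's actual code:

    // Illustrative only: connect to the server named in the status_network
    // row and drop that site's database there, aborting loudly on failure.
    $dbhost = !empty($sn->dbhost) ? $sn->dbhost : $sn->hostname;
    $dbname = !empty($sn->dbname) ? $sn->dbname : $sn->nickname;

    print "Dropping database '$dbname' on '$dbhost'...\n";
    $conn = mysqli_connect($dbhost, $adminUser, $adminPass);
    if (!$conn) {
        die("Couldn't connect to $dbhost: " . mysqli_connect_error() . "\n");
    }
    if (!mysqli_query($conn, "DROP DATABASE `" . $dbname . "`")) {
        die("DROP DATABASE `$dbname` failed: " . mysqli_error($conn) . "\n");
    }
    print "Done.\n";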
* Moved notification sending from Notice::saveReplies to the distrib queue handler, so it'll pull from the reply set we've saved regardless of how we got it.
* Set up gettext infrastructure for command-line scripts; this gets localization of mail notifications etc. working from background queues.
* Adjusted locale switching: common_switch_locale() now works at runtime for background scripts and forces a message catalog update (see the sketch below).
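For instance, a background queue script can switch to the recipient's language while composing a notification and then switch back. A rough sketch, assuming common_switch_locale() with no argument restores the default locale; mail_to_user(), $recipient, and $sender are placeholders:

    // Compose a notification in the recipient's own language from a
    // background script, then restore the default locale afterwards.
    common_switch_locale($recipient->language);
    $subject = _('New reply to your notice');
    $body    = sprintf(_('%s replied to your notice.'), $sender->nickname);
    mail_to_user($recipient, $subject, $body);
    common_switch_locale(); // back to the default locale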