FormNoticeXHR is now triggered on any form labeled with the class 'ajax-notice', so forms other than the traditional notice form should work as long as they handle the AJAX submission and return a properly formatted notice.
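For illustration, a minimal sketch of how a custom form might opt in (the PollForm name and the formClass() override are assumptions about how a plugin would typically hook into StatusNet's Form widgets, not code from an actual plugin):

class PollForm extends Form
{
    // Including 'ajax-notice' in the form's class attribute is what lets
    // FormNoticeXHR pick this form up for AJAX submission.
    function formClass()
    {
        return 'form_settings ajax-notice';
    }

    // formData(), action(), formActions() etc. as usual; the action this
    // form posts to must return a properly formatted notice on success.
}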
Things to watch out for:
* to determine whether the resulting notice should show on the current timeline, the JS code needs to be able to check the author and such. Keeping the existing vcard bits helps for this!
* the notice form submission stuff clears out inputs from your form -- test to make sure this behaves correctly
* error messages returned from the server _should_ come through, but this needs more testing for consistency
* while form components that aren't in a custom form should just be ignored, this should be tested more (e.g. there's no location or attachment box for the poll or bookmark plugins)
* NoticeListItem isn't currently reachable via autoloader -- touch NoticeList explicitly before calling into it for now.
Note that changing the attachment markup from <label for/><input id/> to <label><input></label> affects some of the existing styles, which try to position both elements in the same place based on their having a common parent. Only 'neo' has been fully tested and fixed for this case, as the others all fail due to the new layout anyway. :)
* '1.0.x' of gitorious.org:statusnet/mainline:
Initial checkin of Poll plugin: micro-app to post mini polls/surveys from the notice form.
Localisation updates from http://translatewiki.net.
More doc comments on MicroApp stuff; some of the show-notice code & the ActivityStreams stuff are a bit wonky and may need smoothing out
Doc comments for MicroAppPlugin
mailboxes were wrongly overriding global menu
This version is fairly basic; votes do not (yet) show up as a reply, they just go in the table. No pretty graphs for the results yet, just text.
The ActivityStream output is temporary and probably should be replaced; the current structures for adding custom data aren't really ready yet (especially since we need to cover JSON and Atom formats, probably pretty differently)
Uses a similar system to Bookmark's for attaching to notices -- saves a custom URI for an alternate action, which we can then pass in and hook back up to our poll object. This can probably do with a little more simplification in the parent MicroAppPlugin class.
Currently adds two tables:
- poll holds the main poll info: id and URI to associate with the notice, then the question and a text blob with the options.
- poll_response records the selections picked by our nice fellows.
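A rough sketch of how these two tables might be declared in the plugin's onCheckSchema() (using the ColumnDef-based schema API of this era; the exact column names, types, and sizes here are guesses, not the plugin's real schema):

function onCheckSchema()
{
    $schema = Schema::get();

    // Main poll info: id/uri associate the poll with its notice;
    // 'options' is the text blob holding the possible answers.
    $schema->ensureTable('poll',
        array(new ColumnDef('id', 'char', 36, false, 'PRI'),
              new ColumnDef('uri', 'varchar', 255, false, 'UNI'),
              new ColumnDef('question', 'text', null, false),
              new ColumnDef('options', 'text', null, false),
              new ColumnDef('created', 'datetime', null, false)));

    // One row per response, recording which option a profile picked.
    $schema->ensureTable('poll_response',
        array(new ColumnDef('poll_id', 'char', 36, false, 'PRI'),
              new ColumnDef('profile_id', 'integer', null, false, 'PRI'),
              new ColumnDef('selection', 'integer', null, false),
              new ColumnDef('created', 'datetime', null, false)));

    return true;
}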
Hopefully no off-by-one bugs left in the selection, but I give no guarantees. ;)
Some todo notes in the README and in doc comments.
* 1.0.x: (68 commits)
Avoid AJAX fetch delay for inline replies when possible; we clone a copy of the notice form skeleton at initialization, then insert it in place instead of fetching a new one.
Fix bad reference
lost a </div> in input_forms
neo is the default
First version of 3cl theme neo.
cleaner is the new default theme (for now)
store reply_to notices as comment activity objects
fix object errors with bookmark notices
save the object type when saving a new bookmark notice
ActivityObject uses Notice's object_type by default
Notice saves its object type
show correct notice in output
UR FACE
wrapper div for primary nav
Revert "abstraction for starting and ending a menu"
Revert "primarynav uses menustart and menuend"
primarynav uses menustart and menuend
abstraction for starting and ending a menu
remove adminpanelnav from adminpanelaction module
Input form switcher works
...
Location information removed from translation files with msgmerge --no-location to decrease size of files and reduce diff size. Unfortunately there does not appear to be a setting in msgmerge or msgattrib to remove the extracted comments ("#.") from translation files. If you do know of such a switch, please let me know!
* 1.0.x:
* translator documentation updated. * superfluous whitespace removed. * small refactoring in noticeform.php to allow proper translator hints.
* translator documentation updated * superfluous whitespace removed * minor L10n and i18n updates
Cleanup & minification for migration to reusable notice form in inline replies. Yay!
Work in progress: inline reply form reusing the main reply form now inserts the successful result more or less right
style fixes for new notice form being reused in reply area
Reusable notice form fixes for geolocation
Loading the original form instead of faking up our own. Sorta works but not pretty :D
Kill some more hardcoded ids...
More hardcoded id cleanup in notice form...
'link' to 'links' in feed document
As a hack, this removes the mysql_timestamp bit from the field settings on reply.modified so that our value actually gets saved. This *should* work OK as long as the system timezone is set correctly; we now set the connection to UTC to match.
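Illustrated with plain mysqli rather than StatusNet's DB layer (the host, credentials, and database name below are placeholders): the point is simply to pin the connection's time zone to UTC right after connecting, so values the server generates line up with the UTC datetimes we write from PHP:

$conn = new mysqli('localhost', 'statusnet', 'secret', 'statusnet');
// Pin this connection to UTC so NOW()/CURRENT_TIMESTAMP agree with the
// UTC timestamps the application stores in columns like reply.modified.
$conn->query("SET time_zone = '+00:00'");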
Changes the replacement of Twitter "entities" from in-place reverse ordering ('to preserve indices') to a forward-facing append-in-chunks that pulls in both the text and link portions, and escapes them all.
This unfortunately means first *de*-escaping the < and > that Twitter helpfully adds for us.... and any literal &blah;s that get written. This seems to match Twitter's web UI, however horrid it is.
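A rough sketch of the forward-facing approach (not the actual bridge code; it assumes each entity object carries Twitter-style indices and a url field, and that $text has already had Twitter's pre-escaped &lt;/&gt; decoded back to plain characters):

function linkify_entities($text, array $entities)
{
    // Walk the text forward: sort entities by their start index.
    usort($entities, function ($a, $b) {
        return $a->indices[0] - $b->indices[0];
    });

    $html = '';
    $cursor = 0;
    foreach ($entities as $entity) {
        list($start, $end) = $entity->indices;
        // Append the escaped plain-text chunk before this entity...
        $html .= htmlspecialchars(mb_substr($text, $cursor, $start - $cursor));
        // ...then the entity itself as a link, also escaped.
        $html .= '<a href="' . htmlspecialchars($entity->url) . '">'
               . htmlspecialchars(mb_substr($text, $start, $end - $start))
               . '</a>';
        $cursor = $end;
    }
    // Finally the trailing text after the last entity.
    $html .= htmlspecialchars(mb_substr($text, $cursor));
    return $html;
}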
http://status.net/open-source/issues/2442
Notes:
* Mapstraction causes JavaScript errors in XHTML mode, breaking our code if it runs later, so the link to get back to Desktop doesn't work.
* not 100% sure how safe feature detection is here?
* Currently the links will be visible but useless if no JS is available; we need to fall back to server-side handling for limited browsers
Now using the original text form of @-mentions and #-tags, as in Twitter's own HTMLification.
Canonical forms are still used in generating links, where it's polite to match the canonical form.
A repeat/retweet is roughly equivalent to an active direct post, so should follow the posting rules, rather than always sending over as we do for fave notifications.
Output from 0.9.6 PuSH feeds seems to have a rump <author> but no
<activity:actor>. It was overwriting valid and useful data set up at
subscribe time.
This fix tries to avoid overwriting data. However, it may prevent
updates that delete data.
Bug: 3028
Should fix issue #3027: twitter user avatars not getting imported.
Due to the change in URI, all twitter users that had been previously seen were getting new profile entries, which tried to save the same avatar. This would fail as Avatar.url has a unique index.
Note: anything new seen in production in the last couple of days will still potentially conflict.
Shows the messages to a private group in a list. New classes for
showing a group private message and list of group private messages.
New actions for showing a stream of group private messages and a
single group private message.
In order to apply to PHP's POST processing, the MAX_FILE_SIZE field must appear *before* the file upload field. They were incorrectly placed after, where they had no effect on POST processing.
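In form-building terms the fix boils down to output order (a sketch using the usual HTMLOutputter helpers; the 'attach' field name and $maxFileSize value are placeholders):

// MAX_FILE_SIZE only affects PHP's POST processing if it precedes the file field.
$out->hidden('MAX_FILE_SIZE', $maxFileSize);   // configured upload limit in bytes
// The file upload field itself comes after the hidden field.
$out->element('input', array('name' => 'attach',
                             'type' => 'file',
                             'id'   => 'attach'));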
There's a new menu layout in this version of the software. It was
implemented as a plugin in 0.9.x to avoid clashes with existing themes,
but we're going to break that compatibility in this version, so we're just going for it.
This change involved moving all the changes in NewMenuPlugin into the
default code that was calling it. In addition, since
accountsettingsaction and connectsettingsaction differed only by menu,
I removed them, changed all references to them to the settingsmenu, and moved
the combined nav to its own class.
Let's put that episode behind us.
The CSS shim that was loaded by NewMenuPlugin for certain themes and certain actions
was removed.
'admin' is a pretty common username that people try when installing;
it was blacklisted because all of our admin panels were at /admin/*,
which would conflict with the admin user's namespace.
Changed the location of all admin panels to /panel/*, blacklisted the
nickname 'panel', and allowed 'admin'. Tested with a fresh install;
seems to work great.
Original fixes in c169dcb5221cf3dd452c291bf97374bb459cc5b9; they didn't get merged in 39cad55711 because the code had been broken out to another file, but the manual merge went smoothly.
These affect twitterstatusfetcher.php on all 32-bit installs and some 64-bit installs (depending on whether the version of the JSON library reads the large numbers as long or double internally). 64-bit bug is harder to see as it tends to manifest as off-by-one due to losing a bit of precision off the end.
Note that the current version of the infinitescroll jquery plugin fixes this, but I'm not updating to it because the code's been altered from the upstream version, apparently to stop it from actually working as infinite scroll. WTF? :)
Note that these tests won't pass on master branch yet as the join/leave don't work, and there's a bug in Activity parsing which prevents interop between new feeds and old remote subscribers (both fixed in this branch).
Given a notice in the local system, we package it up as an Atom entry and MagicSig it up.
We run the magicenv verification on it locally to make sure our own functions can decode it.
Optionally with --verify we can send to Tuomas Koski's verification test service (not sure if this is working 100%)
If given --slap= with a target Salmon endpoint, we'll send it on and see if the endpoint likes it. (Note that StatusNet will reject it if there's not a relevant mention, but will report acceptance for dupes, so you can use a message that's already been delivered as a test.)
Added StartRegistrationTry/EndRegistrationTry calls into those three, and moved the actual recording hook to EndUserRegister which is guaranteed to be called from User::register (so we don't need to worry about other auth methods forgetting to call the other UI-code hooks).
We were passing DOM nodes directly into the queues for the final bookmark import stage; unfortunately these don't actually survive serialization.
Moved the extraction of properties from the HTML up to the first-stage handler, so now we don't have to worry about moving DOM nodes from one handler to the next. Instead passing an associative array of properties, which is fed into the Bookmark::saveNew by the per-bookmark handler.
delicious bookmark exports use the godawful HTML bookmark file format that ancient versions of Netscape used (and has thus been the common import/export format for bookmarks since the dark ages of the web :)
This arranges bookmark entries as an HTML definition list, using a lot of implied close tags (leaving off the </dt> and </dd>).
DOMDocument->loadHTML() uses libxml2's HTML mode, which generally does ok with muddling through things but apparently is really, really bad about handling those implied close tags.
Sequences of adjacent <dt> elements (eg bookmark without a description, followed by another bookmark "<dt><dt>"), end up interpreted as nested ("<dt><dt></dt></dt>") instead of as siblings ("<dt></dt><dt></dt>").
The first round of code tried to resolve the nesting inline, but ended up a bit funky in places.
I've replaced this with a standalone run through the data to re-order the elements, based on our knowing that <dt> and <dd> cannot directly contain one another; once that's done, our main logic loop can be a bit cleaner. I'm not 100% sure it's doing nested sublists correctly, but these don't seem to show up in delicious export (and even if they do, with the way we flatten the input it shouldn't make a difference).
Also fixed an edge case where bookmarks missing descriptions didn't get imported.
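To illustrate the flattening idea (a standalone sketch, not the plugin's actual import loop; the file path is a placeholder): DOMXPath hands back nodes in document order regardless of how libxml nested them, so bookmarks can be paired with their optional descriptions without caring about the bogus nesting:

$dom = new DOMDocument();
// Suppress libxml's warnings; the Netscape bookmark format is nowhere near valid HTML.
@$dom->loadHTML(file_get_contents('delicious.html'));

$xp = new DOMXPath($dom);
$bookmarks = array();

// <dt> and <dd> nodes come back in document order, so the incorrect
// nesting libxml infers from the implied close tags doesn't matter here.
foreach ($xp->query('//dt | //dd') as $node) {
    if ($node->nodeName == 'dt') {
        // Each <dt> wraps an <a href="..."> for the bookmark itself.
        $a = $node->getElementsByTagName('a')->item(0);
        if ($a) {
            $bookmarks[] = array('url'         => $a->getAttribute('href'),
                                 'title'       => $a->textContent,
                                 'description' => '');
        }
    } elseif (!empty($bookmarks)) {
        // A <dd> holds the free-text description of the preceding bookmark,
        // so a missing <dd> simply leaves the description empty.
        $bookmarks[count($bookmarks) - 1]['description'] = trim($node->textContent);
    }
}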
I was trying to generate URIs for Bookmarks based on (profile, crc32(url), created).
I failed at that. CRC32s are unsigned ints, and our schema code didn't like that.
On top of that, my code to encode and restore created timestamps was problematic.
So, I switched back to using a meaningless unique ID for Bookmarks.
One way to do this would be to use an auto-incrementing integer ID. However, we've been
kind of crabbed out a few times for exposing auto-incrementing integer IDs as URIs, so
I thought maybe using a random UUID would be a better way to do it.
So, this patch sets random UUIDs for URIs of bookmarks.
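A sketch of the idea (the hand-rolled UUID helper here is for illustration; StatusNet ships its own UUID class, and the urn:uuid: spelling of the URI is just one reasonable choice):

// Random (version 4) UUID: enough entropy that collisions aren't a practical
// concern, and nothing about it leaks sequential row counts.
function bookmark_uuid()
{
    return sprintf('%04x%04x-%04x-%04x-%04x-%04x%04x%04x',
                   mt_rand(0, 0xffff), mt_rand(0, 0xffff),
                   mt_rand(0, 0xffff),
                   mt_rand(0, 0x0fff) | 0x4000,   // version 4
                   mt_rand(0, 0x3fff) | 0x8000,   // RFC 4122 variant
                   mt_rand(0, 0xffff), mt_rand(0, 0xffff), mt_rand(0, 0xffff));
}

$bookmark->uri = 'urn:uuid:' . bookmark_uuid();   // $bookmark is the hypothetical data object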
Had some problems with PuSH and Salmon use of Bookmarks; they were
being required to generate Atom versions of the bookmark _before_ the bookmark was saved.
So, I reversed the order of how things are saved, and associate notices and bookmarks
by URI rather than notice_id.
Form for saving bookmarks that looks like the delicious.com form.
Save a new notice with the right text, but record it in a new notice_bookmark table which marks it as a bookmark. Tags and URLs are kept the same.
Fixes for Twitter bridge breakage on 32-bit servers. New "Snowflake" 64-bit IDs have become too big to fit in the integer portion of double-precision floats, so to reliably use these IDs we need to pull the new string form now.
Machines with a 64-bit PHP installation should have had no problems (except on Windows, where integers are still 32 bits)
Conflicts:
plugins/TwitterBridge/twitterimport.php <- as this hasn't been broken out, the import code is NOT FULLY UPDATED HERE.
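The gist of the fix, sketched (id/id_str are Twitter's own field names; the surrounding import code is elided): keep the ID as a string end to end and never let it pass through PHP's numeric types:

// Snowflake IDs overflow 32-bit ints and lose precision as doubles,
// so prefer the string form whenever the API provides it.
$statusId = isset($status->id_str)
          ? $status->id_str                  // exact string form
          : sprintf('%.0f', $status->id);    // older payloads: best effort, may already be imprecise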
Some of our caching systems, like the disk cache or memcached, have
significant overhead (network connections or disk I/O).
This plugin adds an additional layer of in-process cache, so we don't
need to reconnect to external cache systems when we've already
received a data item from the cache. There are some concurrency issues
here, but typically they won't be important at the level of a single
web hit.
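Conceptually the plugin is just a per-request array sitting in front of the real cache; a minimal sketch (the StartCacheGet/EndCacheGet hook names and signatures are my reading of the cache events, so treat them as assumptions, and the set/delete hooks are omitted):

class InProcessCachePlugin extends Plugin
{
    private $items = array();

    // If this key was already fetched during this request, answer from the
    // array and stop the lookup before it hits memcached or the disk cache.
    function onStartCacheGet(&$key, &$value)
    {
        if (array_key_exists($key, $this->items)) {
            $value = $this->items[$key];
            return false;   // stop event processing; we already have a value
        }
        return true;
    }

    // Whatever the real cache returned, remember it for the rest of the request.
    function onEndCacheGet($key, &$value)
    {
        $this->items[$key] = $value;
        return true;
    }
}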
Piwik's current default recommended JS for loading creates a <script> tag via document.write(). In addition to being generally evil, this means the browser doesn't know it's going to need piwik.js until that chunk of script gets executed... which can't happen until all scripts referenced *before* it have been loaded and executed.
The only reason for that bit of script though seems to be to pick 'http' or 'https' depending on the current page's scheme. This can be done more simply by using a protocol-relative link (eg "//piwik.status.net/piwik.js"), which the browser will resolve as appropriate. Since it's now sitting in the <script> tag, the browser's lookahead code will now see it and be able to start loading it while earlier things are parsing/executing.
May be better still to move to an asynchronous load after DOM-ready, but I'm not sure if that'll screw with the analytics code (eg, not being able to start things on the DOM-ready events since they're past).
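The emitted markup boils down to a plain script element with a scheme-relative src (a sketch from a plugin's EndShowScripts hook; the host below is just the tracker mentioned above and would normally come from the plugin's configuration):

function onEndShowScripts($action)
{
    // Scheme-relative src: the browser picks http or https to match the page,
    // and its lookahead parser can start fetching piwik.js immediately.
    $action->element('script',
                     array('type' => 'text/javascript',
                           'src'  => '//piwik.status.net/piwik.js'),
                     ' ');
    return true;
}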
The default full build of OpenLayers.js is 943kb as of 2.10; this gzips down to a couple hundred kb
but is still rather nasty, plus loading it off a remote host could slow things down.
Using a local copy let us cut down the size significantly by discarding unused features, and further
minification with yui-compressor shaves a bit more off. Cuts down to about 1/5 the size of the
original.
Also threw in a bundled & minified copy of the Mapstraction classes plus our usermap.js,
which covers the common case of using the default OpenLayers provider. This cuts out three
additional script loads, two of which weren't getting launched until after the mxn.js main
file got loaded.
Included Makefile will recreate the OpenLayers.js using the statusnet.cfg strip configuration file
and yui-compressor to do some extra minification at the end. Requires fetching the OpenLayers
source download and dropping it in:
http://openlayers.org/download/OpenLayers-2.10.tar.gz
$config['twitter']['ignore_errors'] = true;
A longer-term solution is to patch up the indirect retry handling to count retries better, or delay for later retry sensibly.
common_shorten_links() can only access the web session's logged-in user, so never properly took user options into effect for posting via XMPP, API, mail, etc.
Adds an optional $user parameter on common_shorten_links(), and a $user->shortenLinks() as a clearer interface for that.
Tweaked some lower-level functions so $user gets passed down, generalizing the $notice_id param that was previously there for saving URLs at notice save time.
Note also ticket #2919: there's a lot of duplicate code calling the shortening, checking the length, and reporting near-identical error messages. These should be consolidated to aid in code and translation maintenance.
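Usage from non-web entry points then looks roughly like this (a sketch; where exactly the optional $user parameter sits in common_shorten_links()'s signature may differ):

// Posting on behalf of a specific user (XMPP, API, mail, ...):
// the user's own shortening preferences are honored.
$content = $user->shortenLinks($content);

// With no user in hand, fall back to the site defaults as before.
$content = common_shorten_links($content);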
Separating the two forms (one to create a local account, the other to attach the OpenID to an existing account) gets them working -- enter activates the appropriate default button.
We were clearing the counter on the window title in the blur event, which gets fired *after* we switch away, thus triggering Firefox to mark the tab as updated again.
Clearing the counter on *focus* instead avoids this, and keeps the counter out of the way as well.
Identified several bugs and fixmes, and added more thorough labeling of the issues with replicating the entire HTML structure of notices (no i18n, missing new features, maintenance problems, possible other issues)
Most annoying error case being where the notice was already faved or deleted on Twitter! :)
Such errors will now just fail out and log a note to the syslog -- the rest of what we were doing will continue on unhindered, so you can still delete, favorite, etc and it just won't sync the info over in that case.
This option may be useful for intranet sites that don't have direct access to the internet, as they may be unable to successfully fetch those resources.
When the retweet failed with a 403 error (say due to it being a private tweet, which can't be retweeted) we would end up mishandling the return value from our internal error handling.
Instead of correctly discarding the message and closing out the queue item, we ended up trying to save a bogus twitter<->local ID mapping, which threw another exception and led the queue system to re-run it.
- Fixed the logic check and return values for the retweet case in broadcast_twitter().
- Added doc comments explaining the return values on some functions in twitter.php
- Added check on Notice_to_status::saveNew() for empty input -- throw an exception before we try to actually insert into db. :)
Added the necessary classes to send email summaries. First, added a
script to run on a daily basis. Second, added a queue handler for
sending email summaries for users, and another to queue summaries for
all users on the site. Fixed up the email_summary_status table to
store the last-sent notice id, rather than a datetime (since we don't
support 'since' parameters anymore). Finally, made the plugin class
load the right modules when needed.
Pulled common code for the profile page and profile list cases to give them the same logic on checking. Also fixes the problem that you'd get a flag button for yourself in profile lists, while we explicitly exclude that from the profile page -- it's now skipped in both places.
StatusNet core code now sets the tooltip text on .attachment.more links when they receive their attachment-expansion magic; this will override the hardcoded tooltip text saved from OStatus plugin when displaying timelines in the web UI.
Data are sent to the 'info' level of logging, like so:
[lazarus.local:4812.86b23603 GET /mublog/api/statuses/friends_timeline.atom?since_id=1353]
STATLOG action:apitimelinefriends method:GET ssl:no query:since_id cookie:no auth:yes
ifmatch:no ifmod:no agent:Appcelerator Titanium/1.4.1 (iPhone/4.1; iPhone OS; en_US;)
Fields:
* action: case-normalized name of the action class we're acting on
* method: GET, POST, HEAD, etc
* ssl: Are we on HTTPS? 'yes' or 'no'
* query: Were we sent a query string? 'yes', 'no', or 'since_id' if the only parameter is a since_id
* cookie: Were we sent any cookies? 'yes' or 'no'
* auth: Were we sent an HTTP Authorization header? 'yes' or 'no'
* ifmatch: Were we sent an HTTP If-Match header for an ETag? 'yes' or 'no'
* ifmod: Were we sent an HTTP If-Modified-Since header? 'yes' or 'no'
* agent: User-agent string, to aid in figuring out what these things are
The most shared-cache-friendly requests will be non-SSL GET requests with no or very predictable
query parameters, no cookies, and no authorization headers. Private caching (eg within a supporting
user-agent) could still be friendly to SSL and auth'd GET requests.
We kind of expect that the most frequent hits from clients will be GETs for a few common timelines,
with auth headers, a since_id-only query, and no cookies. These should at least be amenable to
returning 304 matches for etags or last-modified headers with private caching, but it's very
possible that most clients won't actually think to save and send them. That would leave us expecting
to handle a lot of timeline since_id hits that return a valid API response with no notices.
At this point we don't expect to actually see if-match or if-modified-since a lot since most of our
API responses are marked as uncacheable; so even if we output them they're not getting sent back to
us.
Random subsampling can be enabled by setting the 'frequency' parameter smaller than 1.0:
addPlugin('ApiLogger', array(
'frequency' => 0.5 // Record 50% of API hits
));
If someone tries to register from an IP address that a silenced user
has registered from, prevent it.
When silencing someone, silence everyone else who registered from the
same IP address.