Sometimes systems have _old_ DB_DataObject classes lying around that
get included by default, so we just try to avoid anything that we don't
ship ourselves.
<MMN-o> BeS: I'll commit a patch that will make this issue go away
<BeS> MMN-o: that would be awesome!
<MMN-o> but it might upset bashrc who's working on a Debian package (where you're _supposed_ to include from /usr/php etc. :P)
<MMN-o> but I'll leave a comment along with it
New plugins:
* LRDD
LRDD implements the client side of the RFC6415 and RFC7033 resource
descriptor discovery procedures, i.e. the LRDD, host-meta and
WebFinger lookups.
OStatus and OpenID now depend on the LRDD plugin (XML_XRD).
* WebFinger
This plugin implements the server side of RFC6415 and RFC7033. Note:
WebFinger technically doesn't handle XRD, but we serve both that and
JRD (JSON Resource Descriptor), depending on the Accept header and one
ugly hack to check for old StatusNet installations.
WebFinger depends on LRDD.
We might make this even prettier by using Net_WebFinger, but it is not
currently RFC7033 compliant (no /.well-known/webfinger resource GETs).
Disabling the WebFinger plugin would effectively render your site non-
federated (which might be desired on a private site).
Disabling the LRDD plugin would make your site unable to do modern web
URI lookups (making life just a little bit harder).
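For reference, wiring this up in config.php might look like the
following (a sketch assuming the usual addPlugin()/default-plugins
mechanism; exact plugin handling may differ in your install):

// Private site that doesn't want to federate: drop WebFinger from
// the default plugin set (mechanism assumed; check your version).
unset($config['plugins']['default']['WebFinger']);

// Keep LRDD enabled so the site can still do modern web URI lookups:
addPlugin('LRDD');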
Action classes can now be run by calling the static function 'run'.
Eventually actions will be migrated so that most functionality lives
in parent classes and the child classes don't need as much duplicated
code as they have now.
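A quick sketch of what that looks like from calling code (the exact
signature is approximate; 'AllAction' is just an example class):

// Before: manual boilerplate at every entry point
//   $action = new AllAction();
//   $action->prepare($args);
//   $action->handle($args);
// Now: the static helper instantiates and executes in one call
AllAction::run($args);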
$config['site']['logperf'] = true; // to record & dump total hits of each type and the runtime to syslog
$config['site']['logperf_detail'] = true; // very verbose -- dump the individual cache keys and queries as they get used (may contain private info in some queries)
Seeing 180 cache gets on a timeline page is not unusual at the moment;
since these run serially, even relatively small round-trip times can
add up heavily.
We should consider ways to reduce the number of round trips, such as
more frequently storing compound objects or the output of processing
in memcached.
Doing parallel multi-key lookups could also help by collapsing
round-trip times, but might not be easy to fit into SN's object model.
(For things like streams this should actually work pretty well: grab
the list, then when it's returned go fetch all the individual items in
parallel and return the assembled list; see the sketch below.)
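An illustrative sketch of the multi-key idea using PHP's Memcached
extension (not how SN's Cache wrapper works today; the key scheme here
is made up):

$mc = new Memcached();
$mc->addServer('localhost', 11211);

$ids  = array(100, 101, 102);            // notice IDs from a stream
$keys = array();
foreach ($ids as $id) {
    $keys[] = 'statusnet:notice:' . $id; // hypothetical key layout
}

// One round trip for all keys instead of count($keys) serial gets
$items = $mc->getMulti($keys);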
For instance, this was throwing an exception from DB_DataObject::staticGet when there's no match... definitely not what we want when all our code expects to get a nice null.
Example of this causing trouble: http://gitorious.org/statusnet/mainline/merge_requests/131
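This is the calling pattern the codebase relies on everywhere, so a
PEAR error bubbling up as an exception breaks it (sketch, with User as
an example class):

$user = User::staticGet('nickname', $nickname);
if (!$user) {
    // no match: callers expect a quiet null here, not an exception
    return null;
}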
Revert "Don't attempt to retrieve the current user from the DB while processing a DB error"
This reverts commit 68347691b0.
Revert "Use PHP exceptions for PEAR error handling."
This reverts commit d8212977ce.
Flow:
1. Browser (IE7) is redirected to the login page.
2. Browser reads the page, sees OpenSearch descriptions, tries to
download them. Each request gets recorded by SN as the page the user
should be redirected to after logging in (returnto).
3. User logs in, then gets redirected to the returnto action, which is
an OpenSearch description.
The OpenSearch descriptions aren't sensitive, so making them public on
a private site should be okay.
(I recall fixing this in 0.8.x... :-( )
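One hedged way to express the fix (function and list names are
approximate): treat the OpenSearch description actions as public in
the private-site login gate, so the browser's background fetches never
become returnto targets:

// In the private-site gate, whitelist machine-readable description
// actions alongside the login-related ones (list contents assumed):
function isLoginAction($action)
{
    static $publicActions = array('login', 'register',
                                  'recoverpassword',
                                  'opensearch');  // <- the fix
    return in_array($action, $publicActions);
}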
- switch 'en_US' to 'en'; fixes the "admin panel switches to Arabic" bug
- tweak setting descriptions to clarify that most of the time we'll be using the browser language
- add a backend switch to disable language detection (should this be exposed in the UI?)
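The backend switch is just a config flag; assuming a key along these
lines (the exact name may differ):

$config['site']['langdetect'] = false; // ignore browser Accept-Language; always use the site/user language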
Adds a robots.txt file to the site root. Defaults are defined by the
'robotstxt' section of the config. New events StartRobotsTxt and
EndRobotsTxt let plugins add information. Probably not useful if the
site path is not /, but it won't hurt anything, either.
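A minimal plugin sketch using the new event (the handler's arguments
are assumed; the disallowed path is just an example):

class SampleRobotsTxtPlugin extends Plugin
{
    function onEndRobotsTxt($action)
    {
        // Append an extra rule after the configured defaults
        print "Disallow: /mysensitivestuff/\n";
        return true; // keep other handlers running
    }
}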