First steps toward parsing RSS items as activities. RSS feeds don't seem
to have enough data to make good remote profiles, but this may work
with some "hints".
Parsing h-cards for the data we need turned out to be easy enough that
hKit wasn't worth keeping. It depended on a number of external systems
(something to run Tidy), and could only handle XHTML.
We now parse HTML with the PHP DOM libraries used elsewhere, and
scrape out our own h-cards. This seems to work better and faster, and most
importantly it works with Google Buzz profile URLs.
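A rough sketch of what the DOM-based scraping looks like (the function name here is made up for illustration; the real parsing lives in the plugin's discovery code):

function scrapeHcardNames($html)
{
    $dom = new DOMDocument();
    // Suppress warnings from real-world, non-well-formed HTML.
    @$dom->loadHTML($html);

    $xpath = new DOMXPath($dom);
    $names = array();
    // Find elements carrying the "vcard" class, then pull their "fn" children.
    $cards = $xpath->query("//*[contains(concat(' ', normalize-space(@class), ' '), ' vcard ')]");
    foreach ($cards as $card) {
        $fns = $xpath->query(".//*[contains(concat(' ', normalize-space(@class), ' '), ' fn ')]", $card);
        foreach ($fns as $fn) {
            $names[] = trim($fn->textContent);
        }
    }
    return $names;
}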
* Subscription::start was sometimes passing users instead of profiles to hooks, which broke OStatus subscription notifications; now normalizing to profiles for processing.
* H-card parsing would trigger a lot of PHP warnings and notices in hKit. Now suppressing warnings and notices for the duration of the call to keep them out of output when display_errors is on.
* H-card parsing would trigger a PHP fatal error if the source page was not well-formed XML and Tidy was not present on the system. Switched normalization to use the PHP DOM module, which is always present, as we have no need for Tidy's extra features here.
* Trying to fetch avatars from Google profiles failed and triggered a PHP warning because the relative URL was not resolved during h-card parsing. Now passing the profile page URL into hKit by sneaking a <base> tag in while we normalize the HTML source (see the sketch after this list).
* Profile pages without a "Link" header could trigger PHP notices due to a bad NULL -> array(NULL) conversion in LinkHeader::getLink(). Now checking that there was a return value before converting a single return value into an array.
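The <base> trick mentioned in the avatar fix above, roughly (hypothetical helper name; the actual work happens during h-card normalization):

function normalizeWithBase($html, $pageUrl)
{
    $dom = new DOMDocument();
    @$dom->loadHTML($html);

    // Inject <base href="..."> into <head> so relative avatar URLs
    // resolve against the profile page URL when hKit reads them back.
    $base = $dom->createElement('base');
    $base->setAttribute('href', $pageUrl);

    $heads = $dom->getElementsByTagName('head');
    if ($heads->length > 0) {
        $head = $heads->item(0);
        $head->insertBefore($base, $head->firstChild);
    }

    return $dom->saveHTML();
}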
The base problem is that our caching-on-insert interferes with relying on column default values: the cached object is missing those fields, so they appear to be empty (null) when the object is retrieved from cache.
Now explicitly setting them when inserting subscriptions, and consolidated some code that had alternate code paths.
May also have made auto-subscription work for remote OStatus subscribers, but can't test until magic sigs are working again.
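In sketch form, the insert now fills in what the column defaults used to provide, so the copy we cache is complete. (The column names and the common_sql_now() helper below are assumptions based on the subscription schema, not a quote of the patch.)

$sub = new Subscription();
$sub->subscriber = $subscriber->id;
$sub->subscribed = $other->id;
// Previously left to the DB defaults; set explicitly so the cached
// object matches the stored row.
$sub->jabber  = 1;
$sub->sms     = 1;
$sub->created = common_sql_now();
$sub->insert();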
For instance, the exception-based PEAR error handling was making DB_DataObject::staticGet() throw an exception when there's no match... definitely not what we want when all our code expects to get a nice null.
Example of this causing trouble: http://gitorious.org/statusnet/mainline/merge_requests/131
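The calling pattern the rest of the code expects looks roughly like this (a sketch; User stands in for any of the data classes that go through staticGet):

$user = User::staticGet('nickname', $nickname);
if (empty($user)) {
    // No match: handle it inline rather than catching a PEAR exception.
    return false;
}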
Revert "Don't attempt to retrieve the current user from the DB while processing a DB error"
This reverts commit 68347691b0.
Revert "Use PHP exceptions for PEAR error handling."
This reverts commit d8212977ce.
While deletion is in progress, the account is locked with the 'deleted' role, which disables all actions that are subject to rights control (see the sketch after the todo list).
Todo:
* Pretty up the notice on the profile page about the pending delete. Show status?
* Possibly more thorough account disabling, such as disallowing login and all other access entirely.
* Improve error recovery; the worst case is that an account gets left locked in the 'deleted' state but the queue jobs have been dropped. This would leave the username in use and any undeleted notices in place.
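For reference, the lock described above works along these lines (the role constant and method names are my assumptions about the rights API, not a quote of the code):

$profile = $user->getProfile();
$profile->grantRole(Profile_role::DELETED);

// Anything gated on rights control can then refuse the locked account:
if ($profile->hasRole(Profile_role::DELETED)) {
    // Deny the action while deletion is in progress.
    throw new ClientException(_('This account is being deleted.'), 403);
}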
This bug was hitting a number of places where we had the pattern:
$dbo->find();
while ($dbo->fetch()) {
    $x = clone($dbo);
    // do anything with $x other than storing it in an array
}
The cloned object's destructor would trigger on the second run through the loop, freeing the database result set -- not really what we wanted.
(Loops that stored the clones into an array were fine, since the clones stay in scope in the array longer than the original does.)
Detaching the database result from the clone lets us work with its data without interfering with the rest of the query.
In the unlikely event that somebody is making clones in the middle of a query, then trying to continue the query with the clone instead of the original object, well, they're going to be broken now.
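One way to express the detach, as a sketch (the property name is an assumption about DB_DataObject's internals; the real change may differ):

// In the data-object class: when cloned, drop the shared result handle so
// the clone's destructor can't free the result set the original object is
// still iterating over. The clone keeps its column data.
function __clone()
{
    $this->_DB_resultid = null;
}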
Now using the correct order consistently (URL, then URI if it's http/s), and as a niceness measure skipping the redirect if the only URL we have stored is the local one. (This could happen if a remote OStatus feed has tag URIs and no alt link.)
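The selection logic, roughly (a sketch using the Profile fields as I understand them; $localUrl stands for the profile's own local page URL):

function getRemoteProfileUrl($profile, $localUrl)
{
    // Prefer the stored URL; fall back to the URI only if it's http/s.
    $url = null;
    if (!empty($profile->profileurl)) {
        $url = $profile->profileurl;
    } else if (!empty($profile->uri) && preg_match('!^https?://!i', $profile->uri)) {
        $url = $profile->uri;
    }
    // Skip the redirect when the best we have is our own local page
    // (e.g. a remote feed with tag: URIs and no alt link).
    if (empty($url) || $url == $localUrl) {
        return null;
    }
    return $url;
}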