Many newspapers seem to use animated GIFs as catchy header images. We
would fail to show these from oEmbed/OpenGraph fetching, since they
want us to "use File as Thumbnail", but the only place the image
filename was stored was in File_thumbnail, in the thumbnail row for
that file_id which had a URL set.
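As a rough illustration (the getKV lookup is the generic Managed_DataObject accessor; this is a sketch, not necessarily the commit's exact code), the animated original can be recovered through its File_thumbnail row:
// File_thumbnail is keyed on file_id, so one lookup gets the stored filename
$thumb = File_thumbnail::getKV('file_id', $file->id);
if ($thumb instanceof File_thumbnail && !empty($thumb->filename)) {
    // use the stored (animated) file itself as the thumbnail source
}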
We no longer guess the current profile if the given profile value === -1.
This also sets $this->scoped for all ScopingNoticeStream inheritors, which,
just like in an Action, can be null if we're not scoped in any way (e.g. not logged in).
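A minimal sketch of what an inheritor can now rely on, assuming only what's stated above:
// $this->scoped is a Profile when scoped, or null (e.g. anonymous visitors)
if ($this->scoped instanceof Profile) {
    // apply per-profile visibility filtering here
} else {
    // unscoped: only include what is visible to everyone
}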
Foolproof file redirection
This solves an issue where our internal /attachment/{file_id} links were shortened with remote shorteners, which caused the /attachment/{file_id} links to be saved to the File table and a thumbnail of a thumbnail to be generated (a guard for this is sketched below).
See merge request !98
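One way to guard against the loop, sketched here (the regex and flow are illustrative, not necessarily the merge's exact implementation):
// if a shortened link resolves back to one of our own attachment pages,
// reuse the existing File instead of thumbnailing our own thumbnail
if (preg_match('!/attachment/(\d+)$!', $resolved_url, $m)) {
    $file = File::getKV('id', $m[1]);
    if ($file instanceof File) {
        return $file;
    }
}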
This would overwrite remote URLs with local versions, which removes the source href...
The reason one might have filenames for remote URLs is that the StoreRemoteMedia plugin
fetches them and uses the filename field.
The code was so involved there was even a comment asking for a refactor.
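A hedged sketch of the corrected precedence (File::url() is the usual local-attachment URL helper; the rest is illustrative):
// a File row may carry both a remote url (its source) and a local filename
// (StoreRemoteMedia's downloaded copy); the remote url must win
if (!empty($file->url)) {
    return $file->url;                 // keep the source href intact
}
return File::url($file->filename);     // purely local uploads only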
Now, File_redirection::where always returns a nice File_redirection
object instead of an array or string or nothing. The object is
either one which already existed or else a new, unsaved object.
Instead of duplicating "does it exist" checks everywhere, do it in
File_redirection::where. You either get what exists or something to save.
An unsaved File_redirection may be paired with an unsaved File.
You will want to save the File first (using ->saveFile()) and put the
id in File_redirection#file_id before saving.
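In practice the new contract looks something like this (the ->file accessor for the paired File is an assumption; use whatever the merge actually exposes):
$redir = File_redirection::where($given_url);  // always a File_redirection now
if (empty($redir->file_id)) {          // new, unsaved pair
    $file = $redir->file;              // assumed accessor for the unsaved File
    $file->saveFile();                 // save the File first (as noted above)
    $redir->file_id = $file->id;       // then record its id on the redirection
    $redir->insert();                  // and save the redirection itself
}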
If a new file is uploaded, it will be matched with a previously uploaded
file so we don't have to store duplicates. SHA256 is well-distributed and
collisions are unlikely enough that we can rely on the hash for matching.
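The matching is by content hash; roughly like this (the filehash column name and lookup are assumptions for illustration):
$hash = hash_file('sha256', $uploaded_path);  // hash the uploaded bytes
$existing = File::getKV('filehash', $hash);   // look for an identical prior upload
if ($existing instanceof File) {
    return $existing;                         // reuse it instead of storing a duplicate
}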
If this merge throws an exception in scripts/upgrade.php and you recently
tried a nightly (i.e. during 2015-02-19), just go back a commit or two
and try again.
Or delete the duplicate entries. Find the entries like this:
SELECT COUNT(*), urlhash FROM file_redirection
GROUP BY urlhash
HAVING COUNT(*) > 1;
then for each urlhash (or come up with a smart SQL query) do:
DELETE FROM file_redirection WHERE urlhash='hashfrompreviousquery' LIMIT 1;
You'll have to run the DELETE more than once if you have more than two
identical urlhash entries, or set LIMIT to one less than that hash's
count; a single-pass alternative is sketched below.
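One possible "smart SQL query" that does it in a single pass (MySQL sketch; assumes any one of the duplicate rows is fine to keep and a permissive ONLY_FULL_GROUP_BY):
CREATE TABLE file_redirection_dedup LIKE file_redirection;
INSERT INTO file_redirection_dedup
SELECT * FROM file_redirection GROUP BY urlhash;
DROP TABLE file_redirection;
RENAME TABLE file_redirection_dedup TO file_redirection;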