LiveJournal coding
crschmidt

I sometimes wonder if Brad expects me to do things like I did here anymore. I've done it often enough in the past that it's kind of something that wouldn't surprise me, but I wonder if he even realizes I'm the same guy that whines all the time. Just pondering.

Life is good. Work is good. The house is a mess, but it always is. The commune is much less communist lately: acerbic has been gone too much.

I'm thinking about starting to post friends only again, for stuff that I don't want to be googleable. There are a lot of people who read my journal but aren't on my friends list.

If I were to start posting friends only, and didn't want to read your entries, would you rather be...

on my friends list, but out of my default view: 40 (81.6%)
not on my friends list: 9 (18.4%)

Tell me a secret?



Note that those of you who don't have LiveJournal accounts cannot fill out this poll. The user interface doesn't really make that clear.

A bunch more comic feeds got suspended. I'm not replacing any of them, for those of you who might wonder. Once might be something I could claim ignorance about, but twice is really just pushing it, and is only likely to get someone with a lot of money to sue my pants off.

I'm getting a new cell phone! It's going to be so cool. In the meantime, my cell is dead at the moment. The new phone should be here Friday or Saturday. It's a Nokia 6600, which is pretty much the top of the line in the US. (Europe's phones are 3 times as cool, and Japan's are twice as cool again.)

I'm sure that there are more important things to be posting here, but I can't think of them.

Curiosity... Which comic feeds were suspended?!

Eh, about two dozen of them have gone over the past year and a half. The most recent popular one is Foxtrot. Boondocks was pretty much the first to go, and Dilbert was probably about 6 months ago. There's typically been a bunch of smaller ones that have gone along with them.

LiveJournal will suspend anything that's being syndicated without the copyright owner's permission, if the copyright owner writes in. However, if it's your own content being syndicated (think weblog), you need to block it via technical means (think blocking LJ from loading it).
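
For the "technical means" part, something like this would do it, as a sketch only: if the weblog is served by Apache, refuse the feed to LJ's fetcher by User-Agent. The User-Agent pattern and the feed filename below are guesses, not anything LJ documents, so check your own access logs for what the bot actually sends.

    # Hypothetical .htaccess rules (Apache, mod_setenvif + mod_access).
    # Assumes LJ's feed fetcher sends a User-Agent containing "LiveJournal"
    # and that the feed lives at index.xml -- both are guesses to verify.
    SetEnvIfNoCase User-Agent "LiveJournal" lj_bot
    <Files "index.xml">
        Order Allow,Deny
        Allow from all
        Deny from env=lj_bot
    </Files>

A robots.txt entry asking the same bot to stay away would also work if the bot honors it, but the Apache rule doesn't depend on the bot being polite.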

So, scraped feeds can go away pretty easily. They tend to go in clumps: a lawyer representing a bunch of them will contact LJ. During the last big push, there was some mention of LJ working out a deal to get some of them syndicated legally, but it seems it didn't go anywhere; at least I haven't seen anything about it.

http://crschmidt.livejournal.com/177675.html was the original list. There were a few more after that, but you can see that a fair number are gone.

Yes, quite a few. Did you use some sort of hand-scripted way of scraping each feed, or was there an app that you used, such as jwz's stapler (or whatever he called it)?

Seems to me that scraping would be a lot more commonplace (and probably a lot more accepted) if the barriers were negligible for anyone to do it. Windows apps that would intelligently scrape, suggest possible variants, ftp the feeds to a given location, and maybe allow for a way to share the scraped feeds with others, for instance. The scraping web apps all seem to generate feeds in a uniformly horrible, expurgated, and commercial manner.

It's an idiotically simple Perl script, which breaks with the slightest change. Cheesegrater, which is what jwz's thing is called, is *way* more robust, and even he admits that it's still kind of ugly and breaks easily. At the time, my Perl skills pretty much sucked.
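
To give a sense of how simple (and how fragile) that kind of script is, here's a minimal sketch along the same lines. It's not the actual script; the URL and the regex are made up, and any markup change on the strip page breaks the match, which is exactly the problem.

    #!/usr/bin/perl
    # Minimal comic-scraper sketch: fetch the strip page, pull the image URL
    # out with one regex, and print a single RSS <item> for it.
    use strict;
    use warnings;
    use LWP::Simple qw(get);

    my $url  = 'http://comics.example.com/today/';   # hypothetical strip page
    my $html = get($url) or die "couldn't fetch $url\n";

    # The fragile part: a single regex pinned to today's markup.
    my ($img) = $html =~ m{<img[^>]+src="(/strips/[^"]+\.gif)"}i;
    die "no strip found -- markup probably changed\n" unless defined $img;

    # Emit one RSS item that just embeds the strip image.
    print qq{<item>
      <title>Today's strip</title>
      <link>$url</link>
      <description>&lt;img src="http://comics.example.com$img" /&gt;</description>
    </item>
    };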

Systematically reading and taking content from a webpage is not going to be considered polite for quite a while yet, no matter how easy you make it. There's too much to be lost in terms of ad revenue, and the first guy who makes a popular program to do it is probably going to be sued, even if he does eventually win the fight.

*nods* Sued? Perhaps. Open source apps are hard to sue, fortunately. Trying to sue BitTorrent is going to be infamously difficult, as there are multiple legal uses for it.

Maybe a theoretically popular scraping application would have to be built into a larger RSS reader app, and would have to do similarly cool things with regular RSS and Atom feeds, so as to not be strictly there for scraping purposes.

One way to get around LJ deleting feeds would be to have an app which mirrors RSS feeds to a given LJ account. Keep 'em friends only and nobody would ever know.
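
The posting half of that might look roughly like this. It's a sketch, assuming LJ's XML-RPC postevent call and the XMLRPC::Lite module; the account name, password, and entry text are placeholders, and a real mirror would also have to parse the feed and loop over new items.

    #!/usr/bin/perl
    # Post one friends-only entry to an LJ account via the XML-RPC interface.
    use strict;
    use warnings;
    use XMLRPC::Lite;

    my @now = localtime();
    my $result = XMLRPC::Lite
        ->proxy('http://www.livejournal.com/interface/xmlrpc')
        ->call('LJ.XMLRPC.postevent', {
            username  => 'mirroraccount',      # hypothetical mirror account
            password  => 'secret',             # cleartext; hpassword (md5) works too
            ver       => 1,
            subject   => 'Mirrored item',
            event     => 'Body of the mirrored feed item goes here.',
            security  => 'usemask',            # usemask + allowmask 1 == friends-only
            allowmask => 1,
            year      => $now[5] + 1900,
            mon       => $now[4] + 1,
            day       => $now[3],
            hour      => $now[2],
            min       => $now[1],
        })
        ->result;

    print "posted\n" if $result;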

(Deleted comment)
Some of them I used to pull updates for. I haven't recently, mostly because I just haven't thought about it. If you're reading them in an RSS reader, they should still work.

And I still disagree with the idea of adding a million and one RSS namespaces, but really can't be bothered arguing :)

And you're still wrong ;) And I can be bothered arguing, but I'm aware you wouldn't be convinced, so it would be pointless. ;)
