After carefully considering the issue, Apple is now allowing applications to use the function UIGetScreenImage() to programmatically capture the current screen contents. […]
A future release of iPhone OS may provide a public API equivalent of this functionality. At such time, all applications using UIGetScreenImage() will be required to adopt the public API.
This annoys me. A lot. Not because of the idea of issuing derogations for using this or that private-API function (though I don’t like that either), but because the reason everyone wants to use UIGetScreenImage is that it’s currently the only way for an application to capture video and process it on the fly (or to capture video at all on anything but an iPhone 3GS). I don’t really mind stopgap measures per se, but it scares me to see Apple forum moderators addressing the future issue of having to migrate to the upcoming official screen-capture API, rather than the real problem: not being able to access live video capture from the camera sensor. UIGetScreenImage produces awful three-frame-per-second 160-pixel video, while apps on a jailbroken iPhone 3G can instead produce somewhat fluid, usable video, precisely because they don’t have to use that terribly inefficient workaround of displaying the camera preview, capturing the screen, and encoding the frames themselves, all on a pretty limited CPU.
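For context, the workaround in question boils down to a polling loop along these lines. This is a pseudocode-style sketch, not a runnable example: the `UIGetScreenImage` declaration is a private API (the signature below is the one commonly cited, not a public header), and the `encoder` object is a hypothetical placeholder for whatever software encoder an app would roll itself.

```objc
// PRIVATE API — not declared in any public header.
// Commonly cited signature circa iPhone OS 3.x:
CGImageRef UIGetScreenImage(void);

// In a view controller already showing the live camera preview
// full-screen (e.g. via UIImagePickerController):
- (void)startCapturing {
    // Poll the screen a few times per second; in practice the call
    // is so slow that ~3 fps is about all the hardware delivers.
    [NSTimer scheduledTimerWithTimeInterval:1.0 / 3.0
                                     target:self
                                   selector:@selector(grabFrame:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)grabFrame:(NSTimer *)timer {
    CGImageRef frame = UIGetScreenImage();  // copies the whole framebuffer
    [self.encoder appendFrame:frame];       // hypothetical software encoder
    CGImageRelease(frame);                  // the caller owns the image
}
```

Every frame takes a full round trip through the display and back through a CPU-side encoder, which is exactly why direct sensor access (the kind jailbroken apps enjoy) is so much faster.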
Sure, just because Apple doesn’t mention an upcoming video streaming API doesn’t mean it isn’t coming, but I can easily see them being content with the access they just opened for capturing screenshots. And if or when they finally do give developers direct access to video capture, I can even more easily imagine that they’d still restrict it to the iPhone 3GS.
And it pisses me off because there’s no good reason for that, and because I’m not going to be able to upgrade until next summer at best. (Well, I might be able to, but even if I got a huge contract next week it would be stupid to upgrade my iPhone this late in the product cycle.) I have kitten videos to make!