How good would the experience be with a linux phone and an external camera?
I’ve got a Pixel 6, and although phone cameras are getting better each year, it’s not even close to a DSLM. And video quality is probably better with a proper action camera.
I mean, directly “mounting” the camera to the phone and shooting with the phone.
I feel like this would just be better served by having a phone and a camera. A good large camera will continue to be a good camera for years and years past the time the phone is too old to be useful for modern needs. My almost 20-year-old DSLR still outperforms my phone camera, and my phone is quite recent.
The general idea I’m getting from this is something where the phone itself could be swapped out.
Most nice modern cameras have USB, Bluetooth, or even Wi-Fi connectivity to connect to whatever you want. You could just have a normal phone and a normal camera and copy the files off via whatever method you prefer.
That’s exactly why I want to replace the phone camera, and with Linux there are basically endless possibilities for dealing with it.
My point is that you might as well just have the phone be separate at that point. Instead of having to Frankenstein them together, just have two devices. Also, last I checked, the Linux experience on a small handheld device is not something you’d want to subject yourself to daily; Android is much closer to what you’d want.
Excluding some of the smaller point-and-shoots, which still have more volume than most phones, DSLRs and mirrorless cameras are way bigger than phones for a reason: that’s what it takes to capture genuinely high-quality pictures without cheating heavily with processing.
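To put rough numbers on the size argument (a back-of-envelope sketch; the sensor dimensions below are my approximations, not official specs for any particular model):

```python
# Back-of-envelope comparison of light-gathering area (dimensions are my
# rough assumptions, not official specs for any particular camera/phone).
full_frame_mm2 = 36.0 * 24.0  # standard full-frame sensor, mm^2
phone_mm2 = 9.8 * 7.3         # roughly a large 1/1.3" phone main sensor, mm^2

ratio = full_frame_mm2 / phone_mm2
print(round(ratio, 1))        # roughly 12x the area, so ~12x the light per exposure
```

That order-of-magnitude gap in collected light is what the heavy processing on phones is trying to paper over.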
Would recommend using an external camera to be honest.
There is a ton of software needed to get the most out of a camera, and from the little I understand about embedded image processing, a lot of it happens inside proprietary blobs. As an alternative you can get the raw image directly, but it will look like garbage without reprocessing the input (preferably inside an open-source component, with the downside that you sometimes can’t use the hardware to accelerate this).
Right now, if you wanted a high-quality, mostly open-source Linux device with a camera, IMO you’d be looking at the Raspberry Pi, and there is still a ton of work to do. The work being done there, as well as on libcamera, the userspace stack that replaces raw V4L2 usage for MIPI/CSI cameras, should eventually make its way into Linux phones - but no idea when that will happen.
I thought about tethering. I’ve played with tethering in the past for astrophotography, and it worked alright.
I don’t want to mount the sensor directly to the computer - that’s impossible for a camera noob like me.
I mean, directly “mounting” the camera to the phone and shooting with the phone.
This is pretty standard on most decent cameras, although it’s usually used with the camera and phone separate. Photographers will set up a camera on a tripod and use a phone or laptop to control it remotely. It can be used to control multiple cameras.
The YouTube and TikTok generation will mount the phone to the top of the camera, usually using the flash mount, and face it forwards. This way they can see the screen while they’re facing the camera and check the framing of the shot while they’re shooting.
The biggest problem you’ll find is that the phone apps are designed for Android and iOS, or maybe Windows Phone. I haven’t used a Linux phone, so I don’t know if they run their own apps, or if they run Linux programs. If they run Linux programs, then it’s just a case of finding one that controls your specific camera, and has the controls that you want.
Thanks for the input! I’ll research further in that direction - maybe with Android in mind first.
so I don’t know if they run their own apps, or if they run Linux programs. If they run Linux programs, then it’s just a case of finding one that controls your specific camera, and has the controls that you want.
We can run Linux desktop, Linux mobile, or Android apps, but camera support in Waydroid has been broken for a while when using V4L2.
This is possible in theory: libcamera can expose all of the bits that are needed. Have fun actually finding hardware that supports this, though.
Tbh I just carry a decent Canon point-and-shoot with CHDK on it for my photography needs. Granted, that’s because I went from an Android phone to a cheap KaiOS flip phone, but the point still stands.
Just a thought, but in a few years all old, ugly photos will be able to be refined, upscaled, content-edited, rotated and animated - in 32K ultra. It could even recognize the exact mobile model a random photo was taken with and pre-set the best filters.
It probably won’t matter much whether the photo was taken with a hundred-year-old handheld plate camera or a brand-new mounted digital one - it will look great regardless.
Are you sure photo hardware is the way to go? I think I would just use whatever you already have and upgrade the pictures later when the software allows it.
There is really only so much software enhancement can do. At a certain point, there’s not enough data to interpolate.
Upscaling isn’t really the holy grail
And it definitely can’t make up for the subpar image stabilisation of the Pixel.
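A toy sketch of that limit (my own illustration, not anything from a real camera pipeline): once fine detail has been averaged away at capture time, two different originals collapse to the same small image, so no upscaler can tell them apart afterwards.

```python
# Fine alternating detail in a 1D "image" row (hypothetical toy data).
orig = [0, 10, 0, 10, 0, 10, 0, 10]

# Downsample by averaging neighbouring pairs - roughly what a
# lower-resolution capture does to detail near the sampling limit.
small = [(orig[i] + orig[i + 1]) / 2 for i in range(0, len(orig), 2)]
print(small)  # [5.0, 5.0, 5.0, 5.0] - the alternation is gone

# Nearest-neighbour upscale back to the original size.
upscaled = [v for v in small for _ in range(2)]
error = sum(abs(a - b) for a, b in zip(orig, upscaled))
print(error)  # 40.0 - the detail is unrecoverable from `small` alone
```

A perfectly flat row of 5s would downsample to the same `small`, so the information really is gone; anything an upscaler “restores” is a guess, not a recovery.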
I really doubt that. Computational photography is only as good as it is because of how heavily it processes all sorts of sensor data that never makes it into the JPEG that gets spit out.
I would love to see what an Apple camera could do with the hardware they leverage on the iPhone but paired with a full-frame sensor and a real lens, because what they manage to pull out of that trash input is impressive. But it’s already processed to absolute hell; there’s nothing left for further passes to pull out.