I ordered a dolly from Amazon, to help me with the misc packing up of stuff that I’ve been doing. Just needed something to help me more easily move around the boxes, tubs, furniture, or whatever… but it was just as nice to be able to get a big box for the kitties to play in. This is the first time since Maggie got here that I’ve been able to cut some holes into a big box like this and turn it into a “house” for them to mess around in. Glad to report that they both like it and even had some fun playing together. 😊
- Update at the bottom…
All this screwing around that I’ve been doing with the test 360 degree video files… I’m starting to think that the problem isn’t with me, the camera, the app, or the PC software.
There are two common ways you can view a 360 degree video. The first is the mode where, if the camera was facing north when you started recording, then as the car drives and turns corners and such – from the viewer’s point of view, as long as they don’t touch the video, the view will always keep facing north. So if it’s pointed straight ahead when you start, and you then make a 90 degree left hand turn, it will look like the view swings 90 degrees to the right so it can continue to “face” the direction it started in.
The second method, and the method that I’m trying to make work, is called Direction Lock. When you start the video, if this works correctly, one of the two lenses on the camera will be the “main / starting” lens… so when that lens is aimed towards the front of the car, the video will start with a forward looking view. In this mode, if you make a 90 degree left turn, the camera just continues to use the main / starting lens without change – so its view acts as the view of a person who was sitting on the roof of the car, continuing to look forward no matter which way the car may turn.
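Direction lock is, in effect, just undoing the camera’s own yaw stabilization. Since a yaw rotation in an equirectangular 360 frame is nothing but a horizontal pixel shift that wraps around, the whole idea can be sketched in a few lines of Python. This is purely illustrative – it assumes you already have the stabilized frames as numpy arrays and know the camera body’s yaw angle for each frame, neither of which the Insta360 software hands you directly:

```python
import numpy as np

def lock_direction(frame: np.ndarray, camera_yaw_deg: float) -> np.ndarray:
    """Counter-rotate an equirectangular frame so the view stays fixed
    to the camera body (e.g. the front of the car) instead of to a
    world heading. In equirectangular projection, a yaw rotation is
    just a horizontal pixel shift that wraps around the frame edge."""
    height, width = frame.shape[:2]
    # Convert the camera's yaw into a pixel offset across the 360° width.
    shift = int(round(camera_yaw_deg / 360.0 * width))
    return np.roll(frame, shift, axis=1)
```

So when the car makes that 90 degree left turn, the camera’s yaw changes by 90 degrees, the frame gets rolled by a quarter of its width, and the “front” of the car stays in the middle of the view.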
Surprisingly, there isn’t much that comes up when you google “Insta360 youtube direction lock” other than some tutorials and example videos, and a few random posts and pages that don’t address this particular problem, so I’m kinda just on my own here. Now, in the Insta360 app, the direction locked version plays exactly like it’s supposed to. But no matter how many settings I tweak, or how many adjustments I make, YouTube will only play those same videos in the “floating view” mode. No bueno.
Especially for a 360 video that’s being shot from / on / in a car, our brains basically expect that if we are looking forward, and the car makes a left hand turn, we’ll still stay “looking forward” even though the car has changed directions. The floaty type mode just looks wrong and our brains don’t like it. Now, why doesn’t YouTube support the direction lock mode? No idea.
It used to be that you’d have to use a metadata injector to “prepare” your 360 degree video for uploading to YouTube. That was because their system apparently wasn’t yet ready to auto-detect videos that were meant to be displayed in their 360 degree viewer mode. But now, at least from what I’m reading, most major brands of 360 degree cameras inject enough of their own data into the resulting videos that YouTube does recognize them by default, and therefore displays them in their 360 viewer.
I’m not sure the recognition goes any further than that though. It could be that YouTube simply sees the metadata that says “this is 360” and doesn’t bother looking at any of the other metadata to see if there’s anything else that it needs to do. Because the videos themselves aren’t actually altered when you play around with the view in one way or another… it’s the metadata in them that tells the viewer what to do and when. I’m like 93.7% sure about that. And I think YouTube just ignores it.
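For what it’s worth, that 360 flag really is just a blob of metadata sitting in the file. In the Spherical Video V1 scheme – the one Google’s old spatial-media injector writes – it’s an XML blob inside a `uuid` box with a well-known UUID. Here’s a crude sketch of a “does this file claim to be 360” check; it just scans raw bytes rather than properly walking the MP4 box tree, and it assumes the V1 scheme, so treat it as an illustration rather than a real parser:

```python
# Spherical Video V1 metadata (the kind the Google "spatial-media"
# injector writes) lives in a `uuid` box with this well-known UUID,
# wrapping an XML blob of GSpherical:* tags.
SPHERICAL_UUID = bytes.fromhex("ffcc8263f8554a938814587a02521fdd")

def looks_spherical(mp4_bytes: bytes) -> bool:
    """Crude check: does the file carry the V1 spherical-video UUID box
    and the GSpherical 'this is 360' flag? (A real parser would walk
    the moov/trak box tree instead of scanning raw bytes.)"""
    return (SPHERICAL_UUID in mp4_bytes
            and b"<GSpherical:Spherical>true</GSpherical:Spherical>" in mp4_bytes)
```

The same XML blob carries other tags too (stitching software, initial heading, and so on), which is exactly the kind of thing a player could honor or ignore – and ignoring it would explain the behavior I’m seeing.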
I’ve got one more set of test uploads to try, and if none of these are successful then I’m going to consider my theory correct… but then I’m not sure what to do. My project, whether I actually finish it or not, kinda relies on having that direction locked way of viewing them. I suppose the next step will be trying them on another platform. I haven’t really used Facebook for well over a year, but I suppose I can set things to private and upload the test videos there. If they’re a success… well, I’m not sure what I’ll do.
But yeah, as you can see, this has been keeping my brain busy, so that’s a good thing, right?
EDIT: Great googily moogily, I did it. One of my test videos uploaded to YouTube is actually playing the way that I wanted it to. Stays locked to the forward facing view of the car, so when the car turns left, the view turns left. Car turns right, view turns right… working just like a passenger in the car with their eyes locked forward. But then of course you can still scroll around in whatever direction you want with your finger, or with your mouse if you’re on a computer, but yeah… now to see if I can recreate it with all three videos that I shot the other day.
I started yesterday’s time-lapse recording before the sun had set, but the cloud cover obscured a good portion of the sky… enough that I caught fewer airplanes passing through the frame than the night before. (Despite the video being longer.) Looks like tonight will be even worse, cloud-wise, so I probably won’t even make an attempt. Next thing I wanna try is the .5x wide angle lens, even though it doesn’t collect light as well as the main lens. Might take a little more fiddling to get a decent result, but that’s half the fun.
(Click the bottom right corner to increase to highest resolution and expand to full screen.)
My previous time-lapse video of the stars was meant to not only capture them in a way that showed obvious motion, but also to capture the background “dust” of stars along with the brighter, more prominent ones. This also had the side effect of really illuminating any clouds that passed by, plus any airplanes or shooting stars that went through the frame were so quick that they were difficult to notice. That video and method did what I intended, but last night I decided to do something different.
You’ll need to turn your brightness all the way up, and possibly wait until your room is mostly dark, in order to see what I ended up capturing. It doesn’t help that YouTube’s compression has filled the video with artifacts and essentially erased many of the dimmer stars, but you can still see what I was going for.
(Expand to full screen, switch to 4K, and maybe drop the speed by 50% for best results.)
Last night I shot each frame of video as a 4K, one second exposure, at the highest ISO that the iPhone 11 allowed. That on its own wouldn’t really do the trick, so I also applied a method of shooting that artificially extends the brightness of anything that moves through the frame. With an additional 30 seconds of “light retention,” the time-lapse can be played at 30 frames per second while the “enhanced” light still trails the original object in real time for one second.
That allows the much higher resolution time-lapse video to move at a pretty good clip, while still making any moving object much more obvious to the eye. In this video you’ll see several airplanes, which are unmistakable as they travel across most of the frame – however, you’ll also see shorter, and possibly quicker, streaks of light that only appear in a small portion of the view. Those are quite likely “shooting stars” from the Perseid meteor shower that’s happening over these few days.
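The “light retention” trick amounts to a lighten-stack: each output frame keeps the per-pixel maximum of the last N source frames, so anything bright that moved through the shot stays visible for N frames of playback. A minimal sketch with numpy – assuming the frames are already loaded as arrays, and noting that whatever the capture app actually does under the hood may differ:

```python
import numpy as np

def light_retention(frames, keep=30):
    """Lighten-stack a time-lapse: each output frame is the per-pixel
    maximum of the last `keep` input frames, so a bright object moving
    through the shot leaves a trail that persists for `keep` frames.
    With keep=30 and 30 fps playback, that trail lasts one second."""
    out = []
    for i in range(len(frames)):
        window = frames[max(0, i - keep + 1): i + 1]
        out.append(np.max(np.stack(window), axis=0))
    return out
```

Doubling the retention window (keep=60) is exactly the “increase the enhanced light” experiment: the trails simply last twice as many frames of playback.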
I like how the airplanes look, so I think tonight I’ll start shooting one of these videos before it even gets dark dark… hopefully catching a few more planes as they pass by, since not only am I already in a low traffic area, but the later it gets, the fewer flights will be passing overhead. I’m also going to increase the enhanced light, doubling what I was doing last night, so we’ll see how it turns out. I’ll probably also have to adjust the brightness and contrast of the resulting video, something I did for last night’s video too, but I think it’ll end up being another unique looking video shot from here at the fortress of solitude.
Continued my testing last night since I knew that I was gonna be awake into the wee hours… standard iPhone 11, shooting 15s exposures @ ISO 500. The partial moon, being slightly below the visual horizon for the first bit, helped to light the slow-moving clouds while still allowing the stars to be captured in the distance. Once it made it above the tree tops, though… that was the end of the night’s experiment.
(Click on the bottom right corner to make sure it’s set to 1080p before expanding to full screen.)
Killer headache at the moment, so I’m gonna refrain from making a real blog entry… but I figured I’d throw this video up from my dashcam in parking mode from earlier today when I had to spend more time in town than I really wanted. Clouds were okay, but this would be much better in more severe weather. Not that I wanna be out and having to park in the open when shitty weather hits. 😏 (Best in HD of course…)
I stayed up until around 2am last night. Mostly just because I couldn’t fall asleep, but also because I wanted to step outside and see how many meteors I could see from the Lyrids meteor shower. 🤓 I stood outside for a little over ten minutes, freezing, and saw a grand total of none. 😅 I know that I was looking where I was supposed to, so they were either fainter than expected or I was just blinking at the wrong time. 🤷🏻♂️
No big deal, since if I had seen several of them I probably would have been wishing I had figured out the needed settings for my cameras so that I could have captured them. 😒 But earlier in the evening I did mess with the phone and camcorder a bit, to see if I could quickly figure out how to get a noise-free, well exposed photo of the stars in the sky, while also being able to capture any shooting stars… and it’s not as easy as you’d think, at least not with an iPhone.
It’s funny, I’ve got it down where I can get a decent exposure of the night sky, but I know from trying to also capture a few passing planes in the frame – that the way I was exposing the image requires the stars to “stay put” for about 30 seconds, and any moving source of light just wouldn’t be captured. 🤔😐 Meh… I’m fairly confident with my regular star exposures, so if the sky is clear tonight I might plug up the phone to power and leave it out back to possibly get a decent time-lapse. On a full charge, using just the internal battery, I was only able to get one hundred 30 second exposures, forming this bitty 10 sec time-lapse… 🤷🏻♂️🙂
(As usual, best viewed in HD, full screen… and this time, in a dark, dark room. I’m gettin’ there…)
So it was after 2a before I went to sleep, and then I woke up around 7a because my brain knew I had a doctor appointment at 9:30a – so it was sleeping lightly, waiting for any excuse to make sure that I’d wake up and not sleep through it. 🙄😏 Obviously I’ve had early doctor appointments before, but this one was different, so I’m certainly not complaining that I had to wake up early for it today.
This was my first experience with “telemedicine” as I guess it’s called. 😃 Staci called a little early, at 9a, to see if I was awake and able to go ahead and start… and she then sent a link via text, I clicked on it, it opened a page that started the AV connection as fast as a regular FaceTime call would start, and from that point on it was like a regular doctor appointment. 🤓👨🏻⚕️
She did her normal “pre-doc” stuff of all the typical information that they gather before the doctor actually comes in, she then put me on hold for about 30 seconds, and then my doctor clicked back in and we wrapped up my appointment in less than ten minutes. 😊 Still covered everything that we would have covered in person, and in fact – because I didn’t have to experience the anxiety of the in-person visit, I probably felt better than at a “normal” appointment. That also meant that I didn’t ramble on about any minor “this or that” which often happens when I’m actually there and able to bitch. 😅
So yeah, I could definitely get spoiled by something like that. 🤗 I realize though that a doctor does have to be able to physically interact with a damaged person at least every couple of appointments (to confirm the level of their damage) but it would be nice if I could do two of these, then a regular appointment, two more of these, then a regular, etc. 🤷🏻♂️😕 Unfortunately they probably aren’t able to bill insurance at the same rate when they do “virtual” vs “real” appointments, so once they feel that the COVID risk isn’t as high anymore, everyone will be going back to the regular routine. 😒
Heh… this is probably my longest, most coherent pre-10a entry I’ve ever written. 😋
My sleep has been screwed, so I was actually awake before dawn yesterday… and since my iPad was fully charged I decided to throw her out back to take a time-lapse video for as long as she could. 🤔🤞🏻 Shooting one frame every five seconds, she was able to last for nearly seven hours. 😃 It starts around 6:30a, just a little after dawn, and does eventually go from “bleh and cloudy” to “half-way decent” before it’s over.
(Like all of these time-lapse and sky/astronomy themed vids, it’s best viewed in HD and full screen.)
And I know, it’s essentially the same thing I’ve been doing over and over again here at the house… 😏 although this one was more of a test of the battery. 🤨 So at least now I know that if I set it out there and have one of those booster battery packs already plugged in, there’s a good chance that I’d be able to do a dawn to dusk time-lapse of a single day without it having to be plugged in. 🤓 The trick then would be figuring out which day would have dramatic enough clouds without also having pouring down rain.
Meh… all of this stuff is kinda like my brain just tolerating eating generic, frosting-free “toaster pastries” when it would really prefer the brand name, frosting covered, goo stuffed, delicious Pop-Tarts brand toaster pastries. 😅 i.e. I still wanna be creative, but I have a hard time motivating myself enough to get away from the house to do it. 😒 But hey, this one gets kinda pretty around the middle part, so there’s that…