So here we are. As promised at the end of my GF 80mm ƒ/1.7 lens review, I've received the Fujifilm Instax Share SP-3 printer and 8 film packs totaling 80 exposures. I've never owned an Instax or printed with one, so I decided to use one pack of film to print a full spectrum of recent shots: a couple of portraits (2 printed directly from the camera), monochrome, vivid color, high contrast, golden hour and my personal processing style. I wanted to get a good feel for the film's dynamic range, color, contrast and sharpness. While the film is traditional instant pack film, the printer works by projecting the image onto an 800 x 800 dot micro-OLED display that exposes the film, essentially a digital take on the optical projection used to expose pack film in a traditional Polaroid Land camera. Because it's not fully optical and the micro-OLED is so, uh, micro, I wanted to see how far I could stretch it so I can adjust my shooting style for the best results.
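To put that 800 x 800 exposure in perspective, here's some quick back-of-the-envelope math. Note that the ~62mm image size is my own assumption based on the Instax Square format, not a published Fujifilm spec:

```python
# Rough effective print resolution of the SP-3's 800 x 800 dot
# micro-OLED exposure. The ~62mm square image area is an assumption
# based on the Instax Square format, not an official spec.
MM_PER_INCH = 25.4

dots = 800          # micro-OLED resolution per side
image_mm = 62.0     # assumed image-area edge length in mm

dpi = dots / (image_mm / MM_PER_INCH)
print(f"~{dpi:.0f} dpi")  # → ~328 dpi
```

In other words, even though the micro-OLED is tiny, the print lands in the neighborhood of 300+ dpi, which is why fine detail holds up better than you'd expect.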
The M1 does what Apple promised during the PowerPC transition 15 years ago but failed to deliver. What you hear coming from a millennial's face on YouTube is not hyperbole; the M1 clearly has processing power to spare, since it's able to translate/emulate non-optimized code at a speed where users don't even realize there's an extra computing layer in between. In many cases, it's actually faster. Let me repeat: the M1 is running non-native code, through a translation layer, faster than an Intel chip running the native code natively, and it's doing it with substantially less power consumption and no active cooling. If a millennial is shocked at the performance of the M1 when compared side by side with an Intel-equipped MacBook, they can't even imagine what I, and others my age or older, are experiencing. Our minds are completely blown.

For those who aren't as tech savvy, a metaphorical example: imagine two people trying to read a book written in Japanese. One of them can read Japanese, while the other doesn't understand Japanese at all but has a device that translates it for them. What's happening here, basically, is that the person relying on the translation device is reading the book faster than the native reader can read it directly.

We witnessed this sort of wholesale transition first-hand before, from this very same company. We cursed Rosetta. That was a hard shot of reality after being massaged with marketing hype, promises and a near-total failure to deliver. It also came on the heels of the painful transition from Classic Mac OS 9 to the Unix-based OS X, where little backward compatibility was offered and even that didn't work well. Once we get past the fact that we have software running in translation at a pace faster than the same software running on native hardware, we're then confronted with the fact that it's doing it cooler and more efficiently.
The M1 runs harder for longer with much less energy consumption, and our imaginations are running wild at the prospect of just how much more performance we'll get once our entire workflows are coded to run natively. Even faster(?!?) and more efficient, possibly gaining as much as 50% more battery life once Rosetta 2 is eliminated? We can't even fathom it. Hell, most of us can't even fathom what Apple has already delivered.
The ZX1 is a $6000 time bomb set to go off in 2 years. It won't even be sellable by then, much less for half the price, as the hardware required to run the software will be considered "legacy." Zeiss will have to scrap and refresh the internals every 2-3 years to keep pace with the smartphone industry that ARM, Google and Adobe are tied to. I'm afraid Zeiss has failed to truly consider all of this. Samsung, which has its own smartphone division, tried this years ago with its Galaxy line of Android-powered point-and-shoots and the APS-C mirrorless Galaxy NX, and scrapped the effort when faced with the choice between trying to sell outdated cameras at a profit and annual camera refreshes that made them unprofitable. Ultimately, the product cycles of digital cameras and smartphones were just too dissimilar to be profitable. If I'm not buying one for $6000, and you're not buying one for $6000, who's buying this camera?
As you may know, Fujifilm offers an AC adapter for the GFX series and it's priced at a whopping $97. However, there's another option for AC wall charging as long as you know the power specs.
A more versatile option is a power bank with pass-through charging. You can power the GFX with the battery pack in the field while simultaneously charging the installed battery. When you're near an outlet, you can also connect the power bank to AC power and continuously power the camera without depleting the power bank itself. Because it can function as an AC wall adapter, you won't need to buy one specifically for the GFX, and you retain all the benefits of a portable battery pack.
Coming soon from Apple is the recently announced Magic Keyboard for iPad Pro. Like before, it's a combined protective case and keyboard, but this one will have two dampened hinges and a secondary USB-C port for pass-through charging. The headline feature, though, is the inclusion of a trackpad, along with underlying changes to iPadOS and its UI to accommodate a pointing device for navigation. Basically, the iPad has been becoming more laptop-like since iPadOS diverged from the unified iOS code base, and now the hardware will begin to reflect that change.
Today, despite watching others do this for years, I've finally started using it for its intended purpose: modifying photographs to create images that don't exist in real life.
I guess you can infer from my tone that I'm not a huge fan of photo manipulation, and you'd be correct. There's a fine line between photography and art, and I see wholesale manipulation of an image, to create something that could not be captured in whole within the camera, as dishonest. However, I draw that line at profitability. If you're profiting from a reputation as a photographer while creating digital art and misrepresenting it as photography, I take issue with that. If you're creating art for the sake of it and representing it as such, for profit or not, I have no problem. The gray area is, of course, the line between reality and art. What I did, while photorealistic, is what I would classify as art, because you couldn't recreate my result in a single photograph.
Maybe you've heard of Fujifilm's mostly ignored software companion, X Raw Studio. It was released sometime after the X-T2 and advertised as leveraging the power of their X-Processor Pro image processing engine, aka the camera's onboard CPU, to post-process your photos on a desktop or laptop computer. It does this by connecting your X-Pro2, X-T2, X-H1 or X-T3 over USB 3 or USB-C's SuperSpeed bus, letting you edit your RAW files on a computer while the X-Processor Pro does the actual processing. Since its release, it's sat collecting dust with only minor bug-fix updates, while Fujifilm has established partnerships with the likes of Phase One's Capture One and Skylum's Luminar to natively support the X-Trans system. All of this seems to be the result of traditionally poor support from Adobe, the long-standing leader in the industry. But I have a vision.
Found this as I was going through my junk drawer and tossing out what's now trash or recyclable. It's an ElementCase Vapor Dock for iPhone 4 and 4S, like new and still with the original box. Seeing as how virtually no one in the first world still uses an iPhone equipped with a dock connector, into the recycling bin it goes. That's a solid hunk of aluminum right here, with various bits of stainless steel thrown in. A lovely bit of machining, now pretty much junk.
High contrast scenes tend to work well when processed in monochrome. I'm especially lucky, since Fujifilm's film simulations are such great emulations of their popular film stocks. Their Acros simulation is especially good with high contrast, moody scenes, and I've been processing more and more of my landscape shots with it.
A few of the high contrast shots I took while in Forks worked especially well with the Acros film simulation.
After a day's break from the trip to Forks, Craig and I took advantage of a break in the rain to hit Roozengaarde in Skagit to photograph the tulips before the festival began. About half of the tulips were in bloom and the daffodils were still out, though they looked ready to wilt. Fortunately the weather and time of day seemed to keep most people away. This also gave me a chance to use my Leofoto tripod on different terrain. Again, things just happened to work out for us as the rain held off and the clouds helped give the photos a dramatic, almost ominous look that contrasted with the burst of colors below. Too bad the stiff breeze prevented any chance of getting a longer exposure, but that's fine. For tulips, it's all about the colors.
Day 2 at Second Beach. Unfortunately, this time we hit high tide and quickly discovered the beach was a lot less interesting with the tide in. We managed to make do, again shooting over 200 frames while chasing waves that weren't hitting the shore as intensely as they had the day before. At least having my tripod allowed me to get some long exposures, as the clouds were moving onshore fast and thick.
Funny how things tend to happen in spurts. I spent the past week in Forks and Mount Vernon to get some camera work in. Craig and I went to Second Beach on the Quileute reservation to photograph the sea stacks just offshore. After a day of recovery, we went to Roozengaarde to get some early shots of the tulips before the Skagit Valley Tulip Festival. Despite the weather forecast predicting rain all week, we were fortunate to get breaks in the rain long enough to get all of the shots we had planned for, and then some.
Went to Discovery Park's West Point Lighthouse to observe a confluence of events: a king tide, strong wind gusts and gradually clearing skies. The hope was to catch waves crashing near the lighthouse. Unfortunately, the tide began to recede quickly, and by the time the light was good, the waves could no longer reach the point. The result was a bunch of mediocre photos that I decided to use for practice in Lightroom instead.
Maybe I can sell a couple of these to a church for use as flyers or book covers or something.
Cold, wet... a lot more rain fell than I expected when I left the house. It was an absolute mess, but Fujifilm proved to me how well they sealed the X-T3. Combined with the 16-55mm ƒ/2.8, the combo remained watertight in steady wind and rain with no attempt at protection. Because of the weather I was, obviously, a bit low on inspiration, but figured I'd post what I got for the sake of others who took part.
Normally in Seattle, we'll get a couple of inches of snow per winter. This winter, the snow showed up both late and in force, dropping 5 inches on downtown in a single morning, on top of a few inches earlier in the week plus a few more a day later. As slightly warmer weather and rain have begun melting the snow in the 24 hours since, I've managed to process a few photos that reflect the views around downtown in the early hours, shortly after the snowfall stopped.
I walked around downtown at 4am capturing the empty city streets and landmarks between Chinatown and Pier 66. While not very significant compared to other parts of the country, and even the region, the snow was one of the largest single accumulations in recent history for downtown.