Category Archives: Uncategorized

Configuring Nginx to reverse proxy .NET Core on Mac

Moving on from developing .NET using Visual Studio Community for Mac, I started working on the configuration needed to actually run on a Mac host (and by extension, a non-Windows host).

The main task here is to configure Nginx to assist Kestrel (the .NET Core web server). Kestrel currently excels at running the .NET Core application but is not yet fully featured enough to do a good job with other things, like security and serving assets such as images; hence the requirement for two web servers.

The setup for Nginx to run in this reverse-proxy configuration is pretty straightforward.

MAMP configuration

I use MAMP to package up Apache, MySQL, and Nginx for development work, and it helps keep things fairly painless to configure. However, the basic configuration for MAMP does not allow Nginx to reverse proxy a custom address and port, which is what’s needed to direct requests to Kestrel.

Configuration Steps

The solution is to edit the template that MAMP uses to actually generate the nginx.conf file. Here are the steps:

1. In the MAMP menu, select File > Edit Template > Nginx (nginx.conf). This opens up the template file used to actually generate the nginx.conf. You’ll see various predefined macros in red.

2. Add a section to the template file and leave the rest alone. I chose to add a new “server { }” definition inside the “http { }” section, above the first existing “server { }” definition. This adds a new listener on port 9999 and passes all requests on to Kestrel listening on port 5000.

server {
  # Listen over HTTP on port 9999 on localhost
  listen 9999;
  server_name localhost;

  # Pass all requests through to the Kestrel server
  location / {
    proxy_pass http://127.0.0.1:5000/;
  }
}

3. Start (or restart) Nginx through MAMP and all should be well. The actual nginx.conf file generated by MAMP from the template above can be found in /Library/Application Support/appsolute/MAMP PRO/conf/. It may help to double-check this generated file to make sure your changes are correct and are being picked up by Nginx.
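
As a quick sanity check, nginx can test a configuration file without (re)starting the server. A minimal sketch, assuming an nginx binary is on your PATH (MAMP bundles its own copy, so the exact binary location may differ on your install):

# Test the MAMP-generated config for syntax errors without restarting
nginx -t -c "/Library/Application Support/appsolute/MAMP PRO/conf/nginx.conf"
# A good file should report something like:
#   nginx: configuration file ... test is successful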

To run your solution, simply start Kestrel and your .NET Core app through Visual Studio, or with the dotnet run command-line instruction inside your project folder.
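
For reference, the round trip from the terminal looks something like this (the project path is a placeholder; Kestrel’s default port of 5000 matches the proxy_pass line above):

cd ~/Projects/MyWebApp   # your .NET Core project folder (placeholder path)
dotnet run               # builds the project and starts Kestrel
# Kestrel should report something like:
#   Now listening on: http://localhost:5000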

Nginx will now proxy all calls to Kestrel. Open up your site at http://localhost:9999.

There are further optimizations you can add to the nginx.conf file, as sketched below. Foremost is configuring static file requests to be handled by Nginx rather than Kestrel, for performance reasons.
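
A minimal sketch of that optimization, assuming your app’s static assets live in a standard wwwroot folder (the root path below is a placeholder; adjust it to your project’s actual location): Nginx serves common static file types itself and forwards everything else to Kestrel.

server {
  listen 9999;
  server_name localhost;

  # Serve common static asset types directly from the app's wwwroot
  # (placeholder path - adjust to your project's location)
  location ~* \.(css|js|png|jpg|jpeg|gif|ico|svg)$ {
    root /Users/you/Projects/MyWebApp/wwwroot;
    expires 7d;
    access_log off;
  }

  # Everything else still goes to Kestrel
  location / {
    proxy_pass http://127.0.0.1:5000/;
  }
}

Kestrel can still serve these files itself; this simply takes that load off the application process.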


Job Interview 1: Shiny!

I’ve been interviewing with several large companies for a role in software development management. I got through to the panel round of interviews with a nationwide retailer to work with their e-commerce presence.

Prior to this, I did a little homework: what I often do to gauge the maturity level of a company’s website, looking for any obvious issues, the kind of technology used, and evidence of UX/UI consciousness.

I found some glaring issues, such as terrible page load times stretching up to 8 seconds, which is a huge no-no. In addition, page analytics from GTmetrix gave it a failing mark, as did Google’s PageSpeed Insights. A product page took a whopping 180+ HTTP requests to load — a number I’ve never seen before (most sites keep it to under a third of that).

All are red flags that indicate the need for attention – the page will take time to load, incurring a ranking penalty on Google, not to mention that customers will drop off; and the extra load on servers could cause scalability problems.

The interview was with my future (non-technical) boss and several of their existing web front-end developers who would have been my subordinates. After the normal pleasantries, the developers proceeded to fixate on my thoughts about the latest in front-end technology, rattling off a list of technologies and asking whether I had used them (some I had, some I hadn’t).

I stated that frameworks and technologies change, and by necessity there is always a need (and desire by developers) to keep trying new frameworks, but there are issues with today’s frameworks that need to be understood in the context of the larger whole.

These issues have a lot to do with the size of frameworks. It is too easy to pull in an entire framework to do something as small as styling a button (e.g. Bootstrap), and now you’ve added some fraction of a second to your page load times.

One interviewer was critical of my experience with one of the older UI frameworks we had used in my previous projects. But a framework is just a framework – a simple way to avoid the tedious JavaScript programming needed to pop up a dialogue box, a panel, or a lightbox – there really isn’t much magic to it. There is nothing a framework provides that cannot be achieved by a good programmer.

Some frameworks undergo a complete ground-up redesign from one version to the next, rendering prior knowledge completely useless, or even dangerous. Interviewing or hiring on the basis of knowing Angular 3.0 or this or that is silly. You need to figure out whether the person you’re hiring can adapt to new technology as it comes out.

I went on to emphasize that a lot of front-end design is necessitated by having to bow down to Google’s presence, and that several technologies are currently incompatible with good SEO – SPAs (single-page applications) built on frameworks like Angular are terrible for SEO, and not good candidates for building out sites that benefit hugely from having their catalogue pages indexable for customers searching for products.

I went on to say that a single-page product view is enhanced by bringing in other critical customer-facing information, such as in-stock status, through better investment in backend systems and established means such as web services and AJAX calls to bring information to the user; and that the latest UI frameworks, while fun, don’t replace the need to deliver these fundamental features when the goal is increasing conversions.

The feedback after the meeting: I did not know enough about the latest technologies – thanks for showing up, but we’re passing on you. While somewhat miffed, I was actually relieved, because if that was the deciding factor, that was not the kind of organization I would have liked to work with.

Lessons to impart:

1. Do not let your developers push a “shiny is better” mentality onto your hiring decisions. Sure, always keep abreast of the latest technology and see how it can apply to your business, or to the efficiency of your dev team.

Remember that we’re solving problems, not having fun with one pet framework or another. A good, decoupled architecture will allow the use of different frameworks or technologies to solve problems as needed – this is Amazon’s approach.

But do not forget to keep pushing the fundamentals – page speed, functionality, SEO-friendliness, user experience. Not one of these elements would have been improved by plugging in the latest jQuery/Angular/ReactJS/Polymer/Web Component framework. What was needed in this case was a roll-up-your-sleeves focus on reducing some of the code bloat on these pages. These frameworks may speed up development (great for programmers, and probably for product owners and stakeholders), but have the downside of slowing down responsiveness for the client.

Somewhere along the way, an architectural decision went wrong with this company’s site, and it needs to be fixed immediately.

2. There is a fundamental problem in current interview processes. I’m sure some good candidates will drop out of a process that relies on a scorecard of technologies. Fewer but harder questions need to be asked, with deep dives into specific areas such as architecture; or more self-awareness and reflection needs to exist in the minds of the interviewers. An interview cannot be factored down into a bingo scorecard of technologies. Some testing of ability should be there, but as a minor formality.

The prospective candidate should talk to the team to assess the environment and the challenges, and should have a chance to exhibit some leadership or analytical skills.

Interviewing well is a skill — not only for the candidate, but for the prospective employer as well.

I would, for example, have been happy to whiteboard a few things, understand the situation, and maybe outline a potential plan, given just a few specific details on challenges, or heck, a scenario.

This is where you can start gauging someone’s ability to think and problem solve, and get a glimpse of how they would be to work with.

I always itch to be in a room with a whiteboard and pen, wanting to jump up and start drawing stuff.

3. Is it appropriate for staff to interview their future manager? Does a sports team interview their future coach? I don’t think so. I think it is a very big mistake contained in a cosy, touchy-feely, inclusive wrapper.

Will staff be amenable to a new manager who has increased oversight when they have been happy with a lax environment? Is there a particular management style that you require from the manager that the staff do not understand? Can the candidate ask frank questions about existing staff issues and things that he/she will be required to fix? The whole issue is fraught with complications.

Did my comments on adhering to fundamentals and being cautious about adopting new technology simply not resonate with interviewers keen to be using the latest, yet showing a considerable blind spot with regard to the fundamentals of good UX (as evidenced by the third-party statistics)?

4. If you’re a non-technical boss, how do you interview a technical person? I raise this question because it was probably the rationale for including the team members.

Naturally, I addressed working style with the future boss, but I sensed a distinct deference on the boss’s part to the technical folks to conduct the interview. There was very little “managerial” level talk.

I feel a manager in this situation should try to assess the candidate’s ability to keep up with technology and keep an open mind on adoption, and to act as a bridge and enabler between the various departments (marketing, development, operations); and should get a sense of how the candidate will manage subordinates, along with their communication ability and political tact.

5. Technology moves on. Knowledge becomes outdated in mere months. Let the manager trust the subordinates who are in the trenches to play with new stuff, to recommend what is worthy of consideration, and to know what sort of effort would be involved in adopting it. Make sure the manager can create this environment. It’s great if the manager can also be somewhat hands-on, but the developers are ultimately the ones who will have to do the implementation anyway, so they need to be the ones with a large amount of input here.

Regardless of the outcome, it was a very, very instructive process. I wish them all the best of luck, of course, but, boy, I really wish they had done a better job of the interview. Those nasty page load times indicate that lots of work needs to be done, and quickly.

Canon Flash Notes

The Canon flash system has some interesting quirks (or features) that may trip up those coming from other systems, especially Nikon.

The biggest confusion is how the camera’s P and Av modes handle flash exposure differently; clearing that up is the primary intention of this post.

Flash in Program Mode

The P (Program) mode flash behaviour is set up to be like a point-and-shoot mode; i.e., fire the flash to illuminate the scene properly no matter what happens and deliver decent, non-blurry results.

It will select a balanced fill-flash mode if the scene is fairly bright (13 EV and higher) and normal flash mode if the scene is dim (10 EV and lower), with a blend of the two between 10 and 13 EV. Naturally, the results will be very different – the former exposes the subject with the background lit fairly well, while the latter favours a properly exposed subject against a darker background — the standard direct flash look.

In P mode, the camera will set a shutter speed between a minimum of 1/60 and the max sync speed, so it should freeze most action fine. If that isn’t sufficient, you need to use a different mode.

Trouble with Flash in Aperture Priority

That brings us to the camera’s Av (aperture priority) mode. Canon’s default behaviour in Av mode is to use fill flash in all lighting situations, which can cause the shutter speed in low light to drag out to long durations (up to 30 seconds), rendering people photography in low light useless. The camera is in effect exposing just for the available light, with no consideration for the supplemental light from the flash.

This is somewhat troublesome default behaviour, at least for me, as I normally shoot people in Av mode and need at least 1/60 or faster to freeze movement. And it can certainly catch people who are not expecting it, including myself the first few times I used flash.

I can see the merits of both approaches, and this illustrates the complexity involved in flash photography and in making a design decision. Nikon, on the other hand, forces 1/60 in aperture priority mode and bundles the options for enabling longer shutter speeds into a flash setting: Slow Sync or Rear Curtain mode.

Luckily, there are two ways on Canon to bring the shutter speed up to that critical 1/60s level or faster while still controlling depth of field via the aperture:

Solution 1: Set Flash sync speed in Av Mode Option

The first solution is to force the camera to select a shutter speed only between 1/60 and the max sync speed (typically 1/200 or 1/250). On the 1D X Mark II and 5DS R, this option is under the ‘External Speedlite control’ menu under the camera settings tab. There is a menu option called “Flash sync. speed in Av mode” with three possibilities: “Auto”, “1/250-1/60sec. auto”, and “1/250 sec. (fixed)” [the maximum sync speed will be 1/200 on the 5D-series bodies].

The default setting of “Auto” is the equivalent of enabling slow sync mode with its long shutter durations, so select either of the two non-Auto options to lock the sync speed to a more hand-holdable, subject-movement-friendly range.

Solution 2: Manual Exposure

The other solution is to work in manual exposure mode coupled with the flash in an automatic exposure mode (Ext.A, Ext.M, or E-TTL on the 600EX). I personally prefer this way of working.

The camera’s manual settings determine the amount of ambient light contributing to the image and the action-stopping potential. The flash settings determine the amount of power output by the flash. Both Ext modes on the 600EX use the sensor on the front of the flash (the “external flash sensor”, hence “Ext”) to meter the scene, while E-TTL mode uses the full features of Canon’s E-TTL / E-TTL II system.

For wedding or event work, I will normally use manual camera + E-TTL on the flash for changing situations. I’ll typically dial in a suitable shutter speed, usually 1/125 or faster, adjust the aperture to set how much ambient light I want, and let the flash do the rest.

For controlled studio situations, manual camera + manual flash gives full control and consistency from shot to shot.

Tip: Working with Auto ISO

Auto ISO can confound your best efforts to balance ambient and fill, so I find it best just to turn it off when using flash. In fact, for flash, you may as well go full manual mode for exposure and ISO.

This was the case when I was shooting Nikon – Auto ISO would float the ISO all over the place with flash, and nothing really seemed to look right unless I turned Auto ISO off. The reason is that the camera doesn’t know what it should do: should it meter and set ISO to get a proper exposure for the scene no matter how dim, or should the flash be the primary light source? Note that this behaviour also differs across Nikon cameras – some will drop to the lowest ISO with a flash attached, some will go up to the highest. So again, it’s probably good advice to go manual ISO on Nikon unless you know what your camera is going to do. Picking an intermediate ISO like 400 will give you good results – not noisy, and you get a little more ambient light than at the camera’s base ISO (usually 100).

The good news is that the Canon bodies I use will fix ISO at 400 when a flash is mounted and turned on with Auto ISO enabled. This is a handy feature that saves you from having to remember to disable Auto ISO when using flash, and you can still override the ISO manually if you want a specific look. I am happy to push the ISO up to 1600 or more to gain more ambient light.

Bottom Line

Of course, if you truly want pro results, I feel you’re better off shooting manual, at least on the camera, and of course getting the flash off-camera!

Framework Bloat and Missing Fundamentals

I’ve been interviewing with several large companies for a new role in software development management. I got through to the panel round of interviews with a nationwide retailer to work with their e-commerce presence.

Prior to this, I did a little homework: what I often do to gauge the maturity level of a company’s website, looking for any obvious issues, the kind of technology used, and evidence of UX/UI consciousness.

The Issues

I found some glaring issues, such as terrible page load times stretching up to 8 seconds, which is a huge problem. In addition, page analytics from GTmetrix gave it a failing mark, as did Google’s PageSpeed Insights. A product page required a whopping 180+ HTTP requests to load — a number I’ve never seen before (most sites keep it to under a third of that).

All are red flags that indicate the need for attention – the page will take time to load, incurring a potential speed penalty on Google, not to mention that customers will drop off; and the extra load on servers could cause scalability problems.

The interview was with my future (non-technical) boss and several of their existing web front-end developers who would have been my subordinates. After the normal pleasantries, the developers proceeded to fixate on my thoughts about the latest in front-end technology. I’m repeating my thoughts in this blog post.

I stated that frameworks and technologies change, and by necessity there is always a need (and desire by developers) to keep trying new frameworks, but there are issues with today’s frameworks that need to be understood.

Framework Code Bloat

These issues have a lot to do with the size of frameworks/libraries. The interviewer was critical of my experience with one of the older UI frameworks we had used in my previous projects. But a framework is just a framework – a simple way to avoid the tedious JavaScript programming needed to pop up a dialogue box, a panel, or a lightbox – there really isn’t much magic to it. There is nothing a framework provides that cannot be achieved by a good programmer – just with a lot more time and frustration.

The downside of frameworks is the amount of code required to include one, especially if no slimmed-down version exists. A framework will typically require a call to the server or CDN to pull its JavaScript includes, plus calls to pull the related CSS and sprites. These calls can introduce extra delays in processing the page and increase browser memory usage. One blog article showed 90% of the CSS going unused on the Bootstrap demo pages.

Single Page Applications

I went on to emphasize that a lot of front-end design is necessitated by having to bow down to Google’s presence, and that several technologies are currently incompatible with good SEO – SPAs (single-page applications) built on frameworks like Angular are terrible for SEO, and not good candidates for building out sites that benefit hugely from having their catalogue pages indexable.

I went on to say that a single-page product view is enhanced by bringing in other critical information, such as in-stock status, through better investment in backend systems and established means such as web services and AJAX calls to bring information to the user; and that the latest UI frameworks, while fun, don’t replace the need to deliver these fundamental features when the goal is increasing conversions.


Don’t Forget the Fundamentals

But do not forget to keep pushing the fundamentals – page speed, functionality, SEO-friendliness, user experience. Not one of these elements would have been improved by plugging in the latest jQuery/Angular/Web Component framework. What was needed in this case was a roll-up-your-sleeves focus on reducing some of the code bloat on these pages. These frameworks may speed up development, but have the downside of slowing down responsiveness for the client.

Using tools such as Google PageSpeed Insights, GTmetrix, and the performance-profiling developer tools in Chrome, Safari, and Firefox, and making the necessary adjustments to the HTML and server side, is key to providing that great first-visit impression – before the user has even started interacting with the fancy libraries and frameworks – or, even more fundamentally, to showing up in a search in the first place.


Regardless of the interview outcome, it was a very, very instructive process. I wish them all the best of luck, of course.

Welcome Back!

After spending a considerable amount of time on my company blog (studioimpossible.com), I’ve decided to create more content here of a more unfiltered, personal nature that would be of less interest to those seeking weddings and photography.

You’ll find musings on software development, personal projects, various rants and raves on customer experience — all things dear to me.

Since this was a forced upgrade (no thanks to GoDaddy!) without an opportunity to transition easily from my previous blogging engine, there will be some broken images from old content.

I hope you’ll like what you see!

Martin

13,000 Days

I attended a great session with wonderful photographer Sandy Puc’ this Sunday at WPPI.  Her founding of, and work with, the Now I Lay Me Down To Sleep organization is both heart-breakingly sad (I wasn’t able to go through the videos at www.nilmdts.org without getting teary) and inspiring.

She mentioned that at one time she had calculated that she had 13,000 days, or about 35 years, left to live, given an average life span.  Of course, nobody really knows whether something unfortunate will happen at any time, cutting that span short.  I’m not that much younger than she is either…so it’s reality check time!

What if days were dollars?  How quickly could someone blow through $13,000?  Would we carefully guard the remaining money, doling it out sparingly?  Or would we squander a few here, a few there, doing not much of consequence?  Yet that’s what we do in the tiniest of increments each day with the time killers that pervade our lives.  Should we treat people who steal the precious seconds and minutes of our time (and by extension, our lives) the same as if they had stolen money from us?  Telemarketers, email spammers, I’m looking at you….

Perhaps we do need that countdown clock, ticking away, a gentle reminder that while most of us have time to spare, life isn’t something to be wasted doing things you don’t enjoy, hanging out with people that you don’t like particularly, or nurturing resentment or guilt.

Maybe a limited lifespan is a gift, something that motivates us to accomplish great things with our lives.  It’s a wake-up call to dust off those dreams and pursue them with the same zeal as a person with a limited time left.

D3 vs D3x

Indirectly through Nikon Canada I had a chance to have a D3x over Christmas.  There have been endless criticisms of Nikon’s pricing on the D3x, a whopping $8000 US, or $9450 CDN, and I won’t touch upon that here (much).  What I wanted to analyze for myself was how it compared to the D3 at the same resolution: noise would likely increase due to the smaller sensor sites, but a downsampled image would reduce noise as the individual noisy pixels were averaged out.  So how would these two opposing factors work out in the D3x?  Would the resampling be able to keep the overall noise close to the D3’s phenomenal quality?  Or would noise increase so quickly as to destroy any smoothing done by the resampling?  Basically, would the D3x fulfill the role of the D3 if pressed to work in high ISO situations?
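
As a rough back-of-the-envelope sketch (a simplified model that ignores read noise, demosaicing, and in-camera noise reduction): averaging N uncorrelated noisy pixels reduces the noise standard deviation by a factor of the square root of N. Downsampling the D3x’s 24.5MP to the D3’s 12.1MP averages roughly two pixels into one:

\sigma_{\text{down}} \approx \frac{\sigma}{\sqrt{N}}, \qquad N \approx \frac{24.5}{12.1} \approx 2 \qquad \Rightarrow \qquad \sigma_{\text{down}} \approx 0.7\,\sigma

That is roughly a 30% cut in noise amplitude, about a stop of effective sensitivity under these assumptions, so the open question is whether that gain keeps pace with the extra per-pixel noise of the smaller photosites.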

Just to set the stage, I do a lot of studio model shooting and very often run into the D3’s low ISO limit of 200.  I would love an ISO 100 or 50.  Resolution is important, but so are smooth skin tonal transitions.  On the other hand, I do available light event shooting as well, and high ISO performance is also important.  So I need a combination of a D3 and D3x.

I set up a very basic test bed: a 70-180 macro lens, chosen mainly so that I could minimize the amount of room my test Christmas scene took; as well, its tripod mount allowed me to equalize the focusing and framing when I switched between bodies without having to dismount the body from the tripod, thereby reducing any sharpness issues related to focusing variations.  The macro lens was also stopped down to f/8 to get into the sweet spot for sharpness before diffraction set in; mirror-up and cable release were used.  The images were shot in 14-bit RAW and processed in Capture NX 2 with the same camera settings.  The D3x image was then downsampled using bicubic interpolation and a small amount of sharpening was used to match the D3 sharpening (both images show some haloes around high contrast areas).

The quick test images are below.

Sharpness
The resampled D3x images show a sharpness improvement over the D3 images.  This should come as no surprise as any softening effects of the AA filter and Bayer sensor layout and interpolation are going to be reduced by the resampling.  This is most evident in the text areas where there is high contrast between the background and text characters.  The D3 is showing softer text and in fact has noticeable moire in some of the characters (which means that the focusing was bang-on) while the D3x image clearly has crisper text.  At all ISOs up to 6400 (the maximum on the D3x), the D3x could resolve more detail in the text than the D3, despite similar noise characteristics.  The crispness is noticeable in the 100% view.  Resampling down is good!

Noise
Noise, or the lack thereof, was a surprising discovery.  The D3x produces extremely noise-free images at low ISOs.  At base ISOs (100-D3x, 200-D3), black areas are surprisingly smooth and devoid of noise, even slightly less than the D3.  It almost appears as though the black levels have been clipped in software.  However, a quick check against the D3 image shows that the same subtle shadow details are still there, so this is amazing sensor noise performance for the D3x.

When ISOs are increased, noise increases, of course.  Per-pixel noise at each camera’s native resolutions also seems to be pretty similar up to ISO 400 as well, after which the D3 slowly pulls ahead.  That’s not to say the D3x isn’t good, just that the D3 is so good.  Thankfully, noise appears more film grain-like as with other Nikons, and is less objectionable below about ISO 1600.  Above ISO 1600, the colour noise does pick up and details get lost as well.  Resampling results at 12MP show that up to ISO 800 or even 1600 the images generated by both cameras appear identical, but with the D3x having the edge on detail throughout.

There is an interesting characteristic I’ve noted in some high-contrast areas — some sort of noiseless dark band or halo around objects at high ISOs (around the model lightbulb), almost certainly some form of processing artifact, as it only appears above around ISO 400, which rules out the lens as the cause.  I thought it was a D3x thing, but I see it on the D3 as well, so I assume it’s noise reduction.

Dynamic Range
I did not make a detailed DR analysis, other than to investigate highlight headroom on overexposure and subsequent recovery.  In that regard, I don’t see much difference below the ISO 800 or so mark — the D3x provides a good amount of latitude for what I would use it for.

Conclusions
The D3x is an interesting conundrum.  On the one hand, there’s the price.  On the other hand, it is arguably the most advanced 35mm DSLR today, with stellar image quality and resolution.  Where does it fit in?

The performance of the D3x is amazingly close to the D3.  Below ISO 800, there doesn’t appear to be much difference between the two cameras, with the D3x providing the extra resolution of course, so it would be the better choice.  ISO 1600 could go both ways, and above that is D3 territory of course.  Both cameras deliver extremely similar colour, a very useful characteristic when working with multiple cameras at the same time.

Now, resampling buys you perhaps two stops, so ISO 3200 images from both cameras at the same resolution appear pretty darned close.  The D3x can be pushed past its comfortable full-resolution limit of ISO 800/1600 to its Hi-2 (ISO 6400) limit with good results this way.

So for anybody not needing the stratospheric ISO 12,800 and 25,600 of the D3, or its high frame rates, the D3x could just as easily be a D3 replacement.  It’s that good.  You won’t have to choose between high-ISO performance and high resolution for just about any shooting situation — you can get both in one camera.  This is if you resample down to the D3’s resolution, of course.  It’s almost cheating — the noise goes away and the sharpness increases when you resample.  Above ISO 1600, one gets into uncalibrated ISO land and colour suffers, just as it does at the corresponding Hi-1 and Hi-2 settings on the D3.  But it’s not as bad as you’d expect, manifesting more as increased noise in my case.  Details still hold up extremely well, especially those not in deep shadows, where the noise lives.

For studio work, the extra usable resolution coupled with the one lower stop of ISO makes the D3x a phenomenal camera.  The sharpness and clarity are real and the noise performance is outstanding.  There is no way upsampling could replicate the extra resolution.  On the flipside, the extra resolution does make a difference when resampled down, so there is no downside to resolution in this case (other than a little sluggishness when reviewing the massive images).  Although the camera slows to 1.8fps in 14-bit mode, the vast majority of studio lights today are unlikely to recycle this fast anyway, so the limiting factor is not the camera.  Landscape shooters should have no problem.

Personally, the D3x is much more tempting than I expected.  I was prepared to reject the high resolution because I felt I would be giving up too much high-ISO capability.  Now that I’ve had a chance to test the D3x, I’m very impressed and intrigued by both its raw image quality and its noise-free resampled image quality, the latter being a key characteristic for a D3 replacement.  It certainly could fit the D3 replacement role, especially since I also have a D700 if I need all-out high ISO performance.  The D3x’s studio performance is what’s most compelling — a noticeable jump in image quality while utilizing my investment in Nikkor lenses is what I’m looking forward to.  The studio / architecture / landscape role is the D3x’s niche.

It all comes down to value.  Is it worth nearly $4000 over the D3?  That’s a terribly tough call.  If the premium were a mere $2000, say, over the D3, I think it would be a done deal.  As such, it’s just so tantalizingly worthwhile an upgrade, yet just that little bit out of the comfort zone.

What remains is to compare this with a medium format camera (hopefully a not-so-far-in-the future test!).