Shooting for the Stars: An Astrophotography Primer

I thought I’d stay on the photography theme a little while longer and touch on a subject I have been playing with off and on for a while now, and that is astrophotography.

I first approached this at the beginning of the millennium when I still lived on the Northern Beaches of Sydney. I had interviewed someone who as it turned out lived close by and was somewhat famous in this area of photography and videography, Steve Massey.

It Started With a Telescope

This got me all fired up, so I went out and spent serious money on a telescope and a film-based Minolta SLR camera, and spent many happy hours shooting images and video, primarily of the Moon.

When I moved back to Western Australia and down to the deep forests of the south-west of Western Australia, I upgraded the telescope and also abandoned the Minolta in favour of a Canon 5DS dSLR.

The absolute lack of ambient (and therefore interfering) light was offset by the amount of cloud we used to get, so the actual telescope time was minimal. And then a particularly ridiculous accident I won’t go into knocked the telescope and its tripod over, rendering the focussing mechanism impossible to use.

Shortly after, I moved back to civilisation 200 km south of Perth, and with COVID hitting, decided to try and revive my damaged telescope.

The spare parts were available, but all things being what they were it took nearly 9 months to actually get them here!

In the interim, I discovered a little gizmo called the MSM, or “Move-Shoot-Move”.

And it’s brilliant.


A little bit of science is needed here to fully explain what the MSM does and why.

When you first get a telescope, you suddenly become acutely aware that the Earth moves through space, and pretty damn quickly at that. It first struck home when I finally managed to get an image of the Moon in the ‘scope that was nice and sharp. I had to go inside for something or other – only gone a minute or two – and when I came back out, my image had gone!

Of course, the Earth’s surface at the equator is rotating at around 465 metres per second, or roughly 1,670 km/h. The upshot is that the sky appears to wheel overhead at about 15 degrees an hour, so the image of the Moon moves steadily across the field of view of the telescope and, depending on the magnification, you get only a minute or two.

If you are lucky enough to be able to get, say Jupiter or Saturn in sight AND focus, then you have mere seconds.
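Working those numbers through more precisely (the half-degree field of view is my illustrative assumption – your eyepiece will differ):

```python
# The sky appears to rotate once per sidereal day (~23 h 56 min),
# i.e. roughly 15 degrees per hour.
SIDEREAL_DAY_S = 23 * 3600 + 56 * 60 + 4       # seconds
rate_deg_per_s = 360 / SIDEREAL_DAY_S          # apparent rotation rate

# Surface speed at the equator, for interest:
circumference_km = 40_075
speed_kmh = circumference_km / (SIDEREAL_DAY_S / 3600)

# Time for a target to cross a high-magnification field of view
# (0.5 degrees is an assumed, illustrative figure).
fov_deg = 0.5
seconds_in_view = fov_deg / rate_deg_per_s

print(f"Apparent drift: {rate_deg_per_s * 3600:.1f} deg/hour")
print(f"Equatorial surface speed: {speed_kmh:.0f} km/h")
print(f"Time to cross a {fov_deg} deg field: {seconds_in_view / 60:.1f} minutes")
```

That two-minute figure is why an untracked Moon wanders out of a telescope’s view so quickly.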

But to get decent still shots, which require lots of light, you need more than this, so there is the dilemma.

You can buy mechanisms for telescopes that follow the Earth’s rotation and can also lock onto celestial objects, but these tend to get very expensive for the hobbyist.

So, enter the Move-Shoot-Move (MSM).


The MSM is a small black box that mounts onto a tripod. There, that was easy.

But there is a lot more to it than that of course. You see, once it is charged up and a camera – either DSLR or mirrorless – is attached via one of the various mounts, the inbuilt motor slowly rotates the camera to counteract the Earth’s spin, so once you have locked onto a subject in the night sky, it always stays in place in the frame.

You set the camera using either its inbuilt intervalometer or an add-on one, and set the aperture and ISO accordingly. If all goes well, you get shots like these.


An intervalometer – an inbuilt function of many cameras – tells the shutter to stay open for a specific period of time, beyond the normal 1/500th or 1/20th sec for example. To get shots like those shown here, shutter times of up to 10 minutes or more are used.

The smart ones can also be set for multiple timed shots and other functions.

If your camera does not possess an internal intervalometer, go to your favourite camera store and ask for an external one that suits your make / model. An example of one I can recommend is the Hahnel Captur Timer Kit from Leederville Cameras.

And while you can fluke it and get a great shot with a single image, those that are REALLY good at this stuff take many, many images of the subject in order to get as much data as possible, and then using specialist software, much of it free, “stack” these together to create a single composite image.
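Incidentally, those multi-minute exposures are only possible because of the tracking. Without a tracker, a widely used rule of thumb – the “500 rule”, a general astrophotography guideline rather than anything from the MSM documentation – limits how long you can expose before stars visibly trail:

```python
# "500 rule": longest untracked exposure (seconds) before stars trail:
#   max_seconds ~ 500 / (focal_length_mm * crop_factor)

def max_untracked_exposure(focal_length_mm: float, crop_factor: float = 1.0) -> float:
    return 500 / (focal_length_mm * crop_factor)

# A 16 mm lens on an APS-C body (crop factor ~1.5):
print(f"{max_untracked_exposure(16, 1.5):.0f} s")   # about 21 s
```

Anything longer than that – let alone ten minutes – and the tracker is doing the heavy lifting.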

Polar South

Of course, there is a catch, sort of. You’ll recall that when I stated the Earth’s movement rate, I was careful to clarify that this speed is “at the equator” – the surface speed varies depending on where you are. More to the point, the MSM’s axis of rotation has to line up with the Earth’s, so it needs to be calibrated – polar aligned – in order to track accurately.

In the Northern Hemisphere this is relatively easy, as they have a celestial body in the sky (where else, I suppose?) called the Pole Star, which to all intents and purposes sits exactly at the celestial pole. By calibrating the MSM to the Pole Star, using a laser scope that comes with the system, you are good to go.

In the Southern Hemisphere we don’t have that luxury, and while there are ways to do this with methods using other stars, these are relatively complicated. So, there is a far better way, and it has added bonuses too.


I have mentioned the PhotoPills app before in stories, using it to calculate sun and moon rise times and locations in order to get the right positioning and timing for specific shots.

But another piece of magic PhotoPills does is let you align the MSM quickly and easily to correctly set it for shooting deep space shots and stars. A combination of the inbuilt compass and a virtual reality overlay, with your smartphone attached via a mount to your MSM, lets you align perfectly to Polar South by simply lining up cross hairs to a central target.

With that done, you can then mount your camera, adjust the appropriate settings for aperture, ISO and the intervalometer and you are good to go.

In theory.

Final Tips

Of course, to get the perfect shot takes lots of practice and patience. I’d recommend a few things to make life easier.

  1. Initially don’t be too ambitious. Just get some shots to get a feel for what you are doing and learn what settings may be best. And make notes, or better, shoot RAW so the camera settings are embedded into the metadata of each image.
  2. To learn where planets, stars, constellations, asteroids, meteor showers and other stuff up there are, download a copy of the free program Stellarium for your PC, Mac, tablet whatever. It is absolutely bulging with information and can also create virtual skies based on locations and times.
  3. Get yourself a headband light that has the red-light option. This way, you’ll be able to see what you are doing but not stuff up your night sight.
  4. Use a decent tripod. The one thing you do not want to happen is for your camera to move in any way at all. I use a Miller Solo75 and can highly recommend it.
  5. The MSM is rated to a specific weight so this limits the lens you can use. Even my Canon 5DS with an 80-200mm is too heavy, so these days I use a Fujifilm X-T20 with a 16mm f/2.8 which is pretty close to what it appears the experts in the field use. But even if you have a base camera with a 28mm or something similar, you can still get some breathtaking shots.
  6. Apart from no camera movement (apart from that given by the MSM of course) the other thing that is imperative is focus. You must have your subject in absolutely pinpoint focus. Some cameras allow you to zoom into the image on the LCD for focussing, so if you have this use it. Otherwise focus to infinity but pull it back just a fraction. Some people place a piece of tape to lock the lens in place once they have that sweet spot worked out.
  7. Learn your camera. Shooting stars and planets etc is NOT the place for “A” for “Automatic!”
  8. Keep away from as much external light splatter as you can. The darker you can get it the better. Avoid streetlights, light from windows, car headlights and even the light of the Moon as much as possible.
  9. Look at as many YouTube tutorials, read as many online articles and so on as you can. There is always something to learn. There are some great tutorials on the MSM web site as a starting point, and you’ll also find some really good YouTube channels you’ll like. I started with this one.
  10. Above all, be patient. Hopefully you’ll jag a great shot within your first two or three attempts, but if you haven’t, just keep trying, as when you do, it’s worth the wait and effort, trust me!
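As a footnote to tip 5, here is a quick sketch of how much sky a wide lens takes in compared with a telephoto (the sensor widths are nominal figures I am assuming, not from the article):

```python
import math

# Horizontal field of view for a rectilinear lens:
#   fov = 2 * atan(sensor_width / (2 * focal_length))

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Fujifilm X-T20 (APS-C sensor, ~23.5 mm wide) with a 16 mm lens:
print(f"{horizontal_fov_deg(23.5, 16):.0f} degrees")     # roughly 73 degrees
# A 200 mm telephoto on a full-frame sensor (~36 mm wide):
print(f"{horizontal_fov_deg(36.0, 200):.1f} degrees")
```

That wide sweep is why a short lens forgives small alignment errors and captures big swathes of the Milky Way.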

Tutorial: The Difference Between A Ring Light and Key Light

I’ve had several people ask me when to use a key light and when to use a ring light.

Let’s start answering that by first defining what each is.

A ring light as the name suggests, is a circular light that either fits around a camera’s lens or is large enough for the camera to shoot through it.

A key light on the other hand is the main source of light in a scene (as distinct from a fill light that is designed to “fill in” any shadows created by the key light).

So, strictly by definition, a ring light COULD be a key light, just to confuse things. However, ring lights are designed for a very specific purpose, and that is to cast an even light onto your subject, thus reducing shadows on the face and illuminating the eyes. As a side benefit, it will also minimise any blemishes.

This is why ring lights are often used in portrait photography and glamour shots.

But they are also useful for imaging a single subject, which is why they are often used in shots of food, for example; and as you may have seen if you are a fan of forensic-type shows on TV, the police love to use them at crime scenes too.

From a vlogger point of view, a ring light is perfect if used in a dark room. The one I reviewed a while back, the Elgato Ring Light has a mount inside the ring for a camera and so by placing it behind your monitor, if you are streaming gaming, or creating tutorials, everything looks very natural.

Under the same circumstances, a key light would most likely need a fill light and a back light to add a three-dimensional look to the image, otherwise it runs the risk of being “shadowy” – unless of course you want that sort of look to emphasise the drama of a situation.

This is also often used in crime and spy shows, for example, when the subject is “hiding in the shadows” so to speak. This, by the way, is commonly called “split lighting”.

Lighting in photography and video is a science unto itself, which is why in bigger productions there is a separate Lighting Director. If this interests you, there are a million online tutorials available, and the best way to start is with basic 3-point lighting – key, fill and backlight – and experiment from there.

By the way, we have an Elgato Ring Light to give away valued at AUD$329. You’ll notice I have a pop up letting you subscribe for FREE to the Australian Videocamera e-magazine (all I need is an email address). For the next month, if you subscribe, you’ll go into the draw to win this Elgato Ring Light.

And if you are already a subscriber, don’t worry, as you’ll be entered automatically.

Review: Elgato Key Light

One thing all cameras need, whether they be video, still, smartphone, dashcam, action camera or pinhole even, is light. That’s what allows the image to be created.

And how that light can be setup, manipulated, coloured and so on has been the content of probably millions of articles and thousands of books and even probably more than that in lectures.

But if the source of the natural light available is not playing the game – too dark, too shadowy, or even too bright – we need ways to temper it, tame it, and even substitute it.

In the science of lighting, the most basic is what is known as 3-point lighting and those three points are called key, fill and back. By placing these three light sources at strategic points around the subject, each illuminates a separate dimension of the subject to create a greater representation of height, width, and depth in the resultant imagery.

This diagram shows the most basic lighting setup.

Elgato has released a light specifically designed to act as a key light, and whilst it seems aimed squarely at vloggers, it can be used in many other situations as well, and has some rather nice features.

In concert with its sibling, the Elgato Ring Light, which I reviewed back in September last year, the Elgato Key Light supports Wi-Fi, letting you control its settings right from the desktop of your PC or Mac.

Using the Elgato Control Centre app, you can turn the light on or off, change the brightness and the colour temperature, or even have it synced with other lights. Featuring a 2800 lumen output from 160 LEDs, you can change the brightness from very bright right down to a very subtle glow, through colour temperatures ranging from 2900 to 7000K.
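As an aside for anyone comparing lights and filters: colour temperatures are often converted to “mired” values when matching sources, since equal mired steps look like roughly equal visual shifts, unlike equal kelvin steps. This is a standard conversion, not something Elgato quotes:

```python
# Mired (micro reciprocal degrees) from colour temperature in kelvin:
#   mired = 1,000,000 / kelvin

def kelvin_to_mired(kelvin: float) -> float:
    return 1_000_000 / kelvin

# The Key Light's quoted range, plus nominal daylight (5600 K):
for k in (2900, 5600, 7000):
    print(f"{k} K -> {kelvin_to_mired(k):.0f} mired")
```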

In my testing the diffusion was utterly glare free and I could see no hot spots or other deviation.

A major advantage of using LEDs, of course, is that the heat generated is minimal and power usage is much less than standard lighting systems.

To set the Elgato Key Light up is simplicity itself. The power source is a 240v adaptor that thankfully has a long lead (about 3 metres) that plugs into the LED panel. This in turn is screwed onto a vertical extendable arm. The connection point is a ball joint allowing a large degree of freedom in choosing angles to point the light. The actual screw fitting is a standard ¼” thread, so the Elgato Key Light can be mounted on any tripod or bracket supporting this size (or with an adaptor, of course, if that is needed).

The base is a familiar Elgato clamp style for attaching to a desk or benchtop.

To connect the Elgato Key Light to your Wi-Fi network, you press the rocker switch to the right and hold it for around 10 seconds until the light flashes, then switch it to the left and use the Elgato Control Centre app to detect and connect.

The only issue I found is a common problem with Wi-Fi based peripherals: if your network allocates non-fixed IP addresses, which is common, then resetting the modem / router may hand the light a different IP address, and you’ll have to reconnect it.

The Elgato Key Light is available for around $329. There is a Key Light Air available as well, which I have not seen, and that retails for around $219.

You can get more information on the Elgato Key Light at the Elgato website.




Roger Waters: Comfortably Numb 2022 – A Very Different Take Using CGI

I am an unashamed Pink Floyd fan, and also love Roger Waters (who left the band bloody yonks ago).  I had an article a few weeks back about the Pulse video and especially the track Comfortably Numb, which I think I described as a masterpiece in video and visual production overall.

Well, Roger Waters has just released a solo version of it called Comfortably Numb 2022. It is described as a “Dark” version. It is that, I have to say, and there’s no guitar solo, which was the highlight of previous versions. But it is extremely haunting for other reasons, not the least being the brilliant use of simple, almost monochrome CGI panning over a landscape.

You can read into this whatever conflict you will – or all conflicts for that matter – but it does use music and video to project intensely powerful emotions, I think, and is a masterclass in simplicity in video / music creation, needing no more than careful thought about what exactly you are trying to portray or say.

See for yourself.


Shooting the Total Eclipse: A Video Tutorial

Just after I wrote my piece on shooting tomorrow’s eclipse, the folk at PhotoPills released a step-by-step video on how to shoot it.

  • How to plan a total lunar eclipse.
  • All the equipment you need to photograph the eclipse.
  • The camera settings you need.
  • Where to focus.
  • And how to photograph the total lunar eclipse step by step.

Photographing / Videoing the Total Eclipse Tomorrow. How to get that Killer Shot!

You may have heard that tomorrow night (Tuesday 8th Nov) in Australia we’ll have the pleasure of a total lunar eclipse. This happens when the Earth gets between the Sun and the Moon, cutting off most of the light that we normally see reflected off the Moon’s surface. Only a bit of the red wavelength manages to sneak through, and so we get the famed “blood Moon”.

Many people will have their cameras, camcorders and smartphones primed for the event, but the trick is knowing exactly when it will happen. If at all.

For example, here in the SW of Western Australia, it will be between 5pm and 7:30pm, and the sun won’t have even set, so we’ll see, well, bugger all.

Conversely, in Hobart, Taswegians will have the spectacle between 8pm-ish and nearly midnight so will have a ripper show!

So, how do you calculate exactly what time you can get that perfect photo or video, whether you live in Dubbo or downtown Coober Pedy?

I first started to (again) dabble in photo and video astronomy just before the pandemic, and over the ensuing period I found a brilliant app called PhotoPills that is sort of a Swiss Army knife for all things astro.

It will calculate for you the exact time of celestial events (sunrise, sunset, moonrise, planet rise, meteor showers etc) for any known location. So, say you want to get the sunrise coming up over the lighthouse at Barrenjoey Headland or Byron Bay, it will work out EXACTLY where you need to be location wise, and at what time to be there.

For camera alignment to get star trails, or aligning a tracking device like the MSM, it will quickly allow you to set up the equipment, so you don’t get errors with streaking and so on.

It has a whole bunch of other stuff too such as aids in working out depth of field, field of view, hyperfocal tables, exposure and much, much more.

Back to the lunar eclipse planning though …

To work this out for your location, there are a number of easy steps to follow using PhotoPills.

  1. Open up the app (Android or iOS) and choose Planner
  2. You may be asked whether you want Precise or Approximate locations to be calculated. Use precise
  3. A Google Earth image will open with a series of coloured lines intersecting at your location and a stack of statistical and other info at the top and bottom.
  4. At the bottom right-hand corner of the map is a Plus sign and an icon showing two diamond shapes. This latter is the Map Settings button. Tap this to open and then under Map Layers, choose Eclipse. This will open the Solar and Lunar eclipses calendar.
  5. Find the date (8/11/2022) and tap on that and then return to the map. This will now be set at the exact date / time of the eclipse.
  6. You can now zoom the map out and see all the eclipse information on the map.
  7. Next, swipe the top panel (above the map) to the left until you see the Eclipse data.
  8. The lines on the map show the stages of the eclipse of the Moon as it travels around the Earth. As you can see, there are areas where the eclipse will not be visible at all (the African continent, Alaska etc).
  9. Let’s assume you want to see the times and other information on the eclipse from a location not where you are. To do this, tap the load button and enter the location. I’ve chosen Port Hedland. The map is now reset with all the Lunar eclipse info set to Port Hedland.
  10. You’ll notice that the lines on the map have different labels – P1, U1, U2, U3 and so on. These relate to the phases of the eclipse – the Penumbral and Umbral. The data above the map matches these phases as the Moon moves through the eclipse stages. So in the case of Port Hedland, the eclipse starts at around 3:10pm when the Moon starts to enter the Earth’s shadow, total eclipse is at 7:00pm exactly and it’s all over at 9:57pm.
  11. Both the top panel and bottom panel can be manipulated by taps and holds to see the eclipse info at different times.
  12. The thin blue line next to the thicker one shows the actual path of the Moon.
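The Port Hedland times from step 10 can be turned into durations with a few lines of Python, handy when planning battery and card capacity for the shoot:

```python
from datetime import datetime

# Local times for Port Hedland on 8 Nov 2022, as read off the PhotoPills panel.
fmt = "%H:%M"
p1 = datetime.strptime("15:10", fmt)        # Moon starts entering Earth's shadow
totality = datetime.strptime("19:00", fmt)  # total eclipse
p4 = datetime.strptime("21:57", fmt)        # eclipse over

print("Start to totality:", totality - p1)
print("Whole event:", p4 - p1)
```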

All the steps above now give you all the data you need to actually plan the shot / video.

Now here comes the clever part of PhotoPills. Or one of them anyway.

The app has a built-in Augmented Reality system (the AR button down the bottom). Tap that, and the data will overlay the scene shown in your device’s camera and show you EXACTLY where the Moon will be in the sky. You have available the horizon and a compass, plus the position (height) of the Moon, letting you set the camera up precisely for any required shot.

Tie all this information together, and you can now plan your shoot precisely. Say, for example, you want to catch the total eclipse over the top of the Sydney Opera House, or Ayers Rock – sorry, Uluru – then you know exactly where you need to be, at what time, and the position of the camera.

It does take a little experimentation and playing with PhotoPills. It is one of those programs whose depths you may never plumb to the fullest. But in the process, you will certainly gain the ability and wherewithal to get those killer photos you see in the likes of National Geographic.

At the PhotoPills website, there is a whole bunch of tutorials to walk you through various scenarios too and to get full use of the app. They also have a free e-Book you can download.

Tip: A great skill to learn for this sort of stuff is compositing in Adobe Photoshop. This image by Jose A Hervas was created using that functionality.

Photopills example

PhotoPills costs $9.99 (bargain!) and is available from Google Play and the App Store.

When was the last time you checked?

We all fastidiously look after our equipment making sure it is kept dry and clean, lenses checked for dust spots, tripod mechanisms kept oiled or greased, drones regularly checked for any fractures or loose props and gimbals locked and put away after use and cleaning.

Batteries are kept in good condition, SD cards and portable drives regularly checked for errors and reformatted – the list goes on and on.

But I wonder how many people as religiously check that all firmware updates are kept up to date?

This morning I decided to do a quick audit of my gear and found that in total I had no less than 27 devices that used a firmware update system in some form or another, and 12 of these were out of date.

Upon checking further, some of these updates were declared as “critical” by the manufacturer…

So there goes my afternoon!

Next is to check what software updates I might need to employ on my 3 PCs.

Mini Tutorial: Blackmagic ATEM Mini Pro -Getting Started

The tutorials for the Blackmagic Design ATEM Mini Pro I have placed on the website over the last 2 years are by far and away the most popular stories of the ones that are there (in the thousands these days).

But it suddenly struck me that although I have done tutorials for creating lower 3rds, using the up and downstream keyers and previewing and switching, I have not done one on the physical setup of the unit.

In truth it is not that hard; there are a couple of things you can do though that make the Blackmagic Design ATEM Mini Pro an even better tool.

Before you start, the first thing I recommend is to download the free Blackmagic ATEM Software Control program. You can get that here.

Once everything is set up, I’ll then explain why this is such a good add-on.

There is another thing that is very useful I have found, but you need a 3D printer for this. It is a set of brackets that lift the Blackmagic Design ATEM Mini Pro off the desk and angle it forwards slightly. This has two benefits; firstly it makes the physical operation a bit easier, with the labels on the various buttons easier to read as you become familiar with the Mini Pro, and secondly, it helps with the cooling of the unit.

You can get the ZIP file containing the printable 3D files here.

Setting Up

The first thing you need to do is apply power to the Mini Pro. Blackmagic supplies a special power supply for this with a screw in connector, so do not lose it as you can’t go to a simple electronics store to get a replacement – you have to get it from Blackmagic.

Note: It is good practice to have the power OFF when attaching all the relevant cables such as HDMI, Ethernet etc.

Next, you want to add in your HDMI cables connecting your cameras. I run GoPros on inputs 2 and 3, leaving input 4 for my Blackmagic Pocket Cinema Camera 6K Pro. Input 1 is for the image coming from my computer.

Why would I do this?

This allows me to capture the steps taken when creating tutorials for software showing mouse clicks, keyboard strokes and so on.

To explain this further I need to digress a fraction and explain my PC setup.

I run a pretty bog-standard Dell tower with one monitor on HDMI and the other on DVI. To get the feed from the computer to the Blackmagic Design ATEM Mini Pro, I have an HDMI splitter connected to the cable; one output runs to my second monitor and the other to HDMI 1 in on the ATEM.

I then run an HDMI cable from the HDMI out of the Blackmagic Design ATEM Mini Pro to the 3rd monitor, an OSEE LCM215E that I love.

The next thing to do is run an ethernet cable if you want to do direct live streaming to Facebook, YouTube etc using the Blackmagic Design ATEM Mini Pro with the Blackmagic ATEM Software Control program (see below).

For audio, the Blackmagic Design ATEM Mini Pro has a pair of standard 3.5mm inputs. How you use these depends on your needs; some people have two mics connected in order to do guest interviews. Others have the first for vocals and the second for musical input via a media player which is the way I have it connected up.

I have tested many, many mics before I found the one best suited to my environment, a Sennheiser MKE600. As this mic uses an XLR (also known as a cannon) connector, you need to get hold of an adaptor to convert it to 3.5mm, but a good electronics store will have these available for a reasonable price.

The final thing you can connect is an external hard drive to record sessions for later playback. This connects to the built-in USB-C port.

Again I have tested many devices in this area and have standardised on a particular unit for a specific reason.

As the Blackmagic Pocket Cinema Camera 6K Pro can also record direct to an external hard drive, I have gone with one of the recommended drives from Blackmagic, which is a Samsung T5. This is available in either a 250GB or 2TB configuration. This means I can double up their usage between the Blackmagic Design ATEM Mini Pro and the Blackmagic Pocket Cinema Camera 6K Pro.

If you choose not to add an external hard drive, the USB-C port can instead be connected to your computer, which will then ‘see’ the whole setup as a pseudo webcam, letting you live stream that way too – just not with anywhere near the same measure of control and flexibility that the Blackmagic ATEM Software Control program gives.

Blackmagic ATEM Software Control Program

Once you have the basic unit set up, you can easily configure your streaming software of choice – Facebook, YouTube etc – to take the input directly from the Blackmagic Design ATEM Mini Pro.

But adding the Blackmagic ATEM Software Control program into the mix expands this capability greatly.

To quote from the manual, “For example, you can manually perform transitions using the fader bar, select internal sources on the program and preview buttons, mix audio using a mixer with channel faders, set up keyers, load graphics in the media pool and much, much more.”

This takes all of the heavy lifting off the PC and any software. But as they say, wait, there is more!

You can set up the ATEM switcher settings in one of two ways, cut bus (the default) or program preview, and these settings are changed in the Blackmagic ATEM Software Control program and fully detailed in the manual (which you can get here if you want a bit of light reading!).

In cut bus mode, as soon as you press the input button it immediately switches to being on air. Alternatively, in program preview mode, it becomes a two-step process to switch sources.

Pressing an input button puts that source into a special preview mode. This means you have the chance to change your mind and select another source if you want and this is the common way of switching in the professional broadcast world.
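If it helps, here is a tiny illustrative model – my own sketch, not Blackmagic code – of the difference between the two modes:

```python
# Toy model of the two ATEM switching modes described above:
# cut bus puts a source on air immediately; program/preview is a
# two-step select-then-take.

class Switcher:
    def __init__(self, mode: str = "cut_bus"):
        self.mode = mode
        self.program = 1   # source currently on air
        self.preview = 1   # source queued (program/preview mode only)

    def press_input(self, source: int) -> None:
        if self.mode == "cut_bus":
            self.program = source   # on air straight away
        else:
            self.preview = source   # queued only - you can still change your mind

    def take(self) -> None:
        # Commit the previewed source to air (program/preview mode).
        if self.mode == "program_preview":
            self.program, self.preview = self.preview, self.program

s = Switcher("program_preview")
s.press_input(3)
print(s.program)   # still 1 - source 3 is only on preview
s.take()
print(s.program)   # now 3
```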

I use the second method, hence my adding the third monitor which gets the preview signal via the HDMI out on the Blackmagic Design ATEM Mini Pro.


As I have stated a couple of times in previous stories, back in the 90s when I was writing and directing the suite of videos that became the definitive tutorials for Windows 95 and Office 95 (there were about 20 x 90 minute videos covering the lot), the equipment we needed to create these – capture on-screen graphics, switch in graphics, add text and so on, let alone the camera to shoot this stuff – was worth tens of thousands of dollars.

Today, with something as simple as a smartphone, you can just about mimic what we did, with a Blackmagic Design ATEM Mini Pro that costs around $500. And do it live to an audience of hundreds, thousands or millions.

It’s mind-boggling and exciting at the same time.

Have fun!

With its stabilisation can a GoPro benefit from a gimbal? Absolutely. Here’s why.

There was a question in one of the Facebook GoPro forums today. Someone asked which gimbal they should / could get that would suit a GoPro 10.

Various options were mentioned (my recommendation having tried it is the Zhiyun Crane M2S), but a couple of members scoffed at the thought.

Their reasoning was that the in-camera electronic stabilisation of the current GoPro is more than enough, and a gimbal would offer no advantages.

This is plainly not true if you think further than simple stabilisation.

A gimbal is also incredibly useful to get into tight places and still retain that stability. Think low down or high above your head shots, especially in a fast-moving environment. You get a much firmer “hold” of the camera than handheld will offer.

Another example is the “around the corner” type of shot where the camera is poked around a corner to reveal something. This is very hard to do handheld or even with a selfie stick and retain stability.

(I am guessing this is why Zhiyun use the “crane” moniker by the way as the gimbal is acting as a pseudo crane in many cases).

In low light, you retain stability even if the shutter speed has been slowed and the aperture opened, again giving better stability than in-camera will usually give.

Depending on the model, you can easily add other devices such as mics or lights, thus giving more flexibility (the M2S has a built-in video light complete with magnetic add on colour filters and other Zhiyun models even have a decent built-in mic).

So as you see, a gimbal can be a bit more than a glorified Selfie stick!

Tutorial: The Camera Does NOT Create the Image. Or Going Beyond “A” for Auto.

If you are new to shooting with your GoPro or DJI drone and want to learn how to get those fantastic shots you see from experienced users, here is a starting point that is very easy to digest and will get you up and running and on your way.

I write occasionally of recurring themes I see in the various newsgroups and social media sites I frequent, as well as in questions received from readers here at Auscam Central.

Another common one is “what camera do I need to … (fill in the blank yourself)?”

Let’s get one thing straight up front.

The camera does NOT create the photo.

What it does is record a moment in time – or moments over time in the case of video – according to a) the instructions given to it and b) the available light.

If all you are doing is recording a memory and have no desire to be in any way creative, by all means set the camera or camcorder on “A” for “Automatic” and simply press the button. People have been doing that since the camera obscura was invented by the ancient Greeks and Chinese – no-one seems to know who came first.

And then you can stop reading right now.

Basic Factors

But if you want creativity in your photography or video shoots, then you need to become familiar with a few things, and once understood, these will take you beyond the “happy snapper” level.

These factors apply no matter whether you are shooting with a smartphone, a GoPro, a mirrorless 4/3rds model, a dSLR or even a cinema camera.

The basic rules I apply to my Blackmagic Pocket Cinema Camera 6K Pro are identical to those used on any other camera.

They revolve around six parameters:

  • Light
  • Aperture
  • Focus
  • Shutter Speed
  • ISO
  • Composition


Light

Firstly, and pretty obviously, if there is no light, there is no way you can capture an image. But just as important as having light is how you use that light. My friend Peter Aitchison, one of Australia’s best photographers, once said to me that photography is “painting with light”, a phrase that has stuck with me.

Have a look at any decent TV drama – I favour BBC productions, especially the period dramas it does so well – and watch how light is used to create a particular ambience, emotion or tension as well as simply lighting the set.

Lighting is a specific skill, which is why there are separate people in the TV and film industries who specialise only in lighting. In the credits, they are often called the Gaffer.

The properties of light you need to understand to get best effect include of course colour, but also colour temperature and strength.

I have a feature on the basics of lighting on the Australian Videocamera website at

Aperture, focus and shutter speed

These three are intertwined, and changing one will affect the others.

The aperture on a camera (some call it an iris in video) dictates the amount of light reaching the sensor to record the image. By using the aperture settings you can make sure there is enough light to create the image (preventing underexposure) or lessen the amount of light to stop an image becoming overexposed.

Aperture is measured in f-stops – the common full stops being f2.8, f4, f5.6, f8, f11, f16 and f22. The larger the number, the smaller the “hole” – the aperture – letting light onto the sensor; each full stop halves the light reaching it.
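To make that stop arithmetic concrete, here is a small illustrative Python sketch (the function name is my own, not from any camera software): the light transmitted scales with the aperture area, which is proportional to 1/N² for f-number N, so each full stop passes roughly half the light of the one before.

```python
# Light transmitted through the lens scales with aperture area,
# which is proportional to 1 / N^2 for an f-number N.
FULL_STOPS = [2.8, 4, 5.6, 8, 11, 16, 22]

def relative_light(n_from, n_to):
    """Factor by which light changes when moving from f/n_from to f/n_to."""
    return (n_from / n_to) ** 2

# Each step down the scale passes roughly half the light
# (not exactly half, because the marked f-numbers are rounded).
for a, b in zip(FULL_STOPS, FULL_STOPS[1:]):
    print(f"f/{a} -> f/{b}: x{relative_light(a, b):.2f}")
```

Going the other way – say from f/4 back to f/2.8 – roughly doubles the light, which is what photographers mean by “opening up a stop”.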

In conjunction with the lens and focal length of the lens, you can also dictate what is called the depth of field in an image. This can get complicated, but in essence it is what part of an image is in focus and what is not.

For example, using depth of field you can set the aperture in concert with the lens so that a subject 2 metres from you is in pin-sharp focus while the background is blurred. A form of this is the so-called “bokeh” effect.
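For the curious, the standard depth-of-field formulas behind that example can be sketched in a few lines of Python. The function names and the 0.03 mm circle of confusion (a common full-frame assumption) are my own illustrative choices:

```python
def hyperfocal(f_mm, n, coc_mm=0.03):
    """Hyperfocal distance in mm (0.03 mm circle of confusion ~ full frame)."""
    return f_mm ** 2 / (n * coc_mm) + f_mm

def dof_limits(f_mm, n, subject_mm, coc_mm=0.03):
    """Near and far limits of acceptable sharpness, in mm."""
    h = hyperfocal(f_mm, n, coc_mm)
    near = subject_mm * (h - f_mm) / (h + subject_mm - 2 * f_mm)
    # Beyond the hyperfocal distance, everything to infinity is acceptably sharp.
    far = subject_mm * (h - f_mm) / (h - subject_mm) if subject_mm < h else float("inf")
    return near, far

# A 50 mm lens at f/2.8 focused on a subject 2 m (2000 mm) away:
near, far = dof_limits(50, 2.8, 2000)
# Only roughly 1.88 m to 2.14 m is sharp, so the background blurs away.
```

Try `dof_limits(50, 16, 2000)` for comparison: stopping down to f/16 widens the sharp band considerably, which is why landscape shooters favour small apertures.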

Note: Cameras such as GoPros and many drone units have a fixed aperture, usually around f/2.8. This means you have to use alternative methods to achieve the effects and techniques normally obtained by changing the aperture. One way is to use ND filters, and there is a tutorial on that here.

Focus might seem obvious but there is a skill involved in this too. Getting the subject in focus and keeping it there might seem like a no-brainer, but in video especially, you sometimes want to shift the point of focus from one subject to another mid-shot (a “focus pull”), and this needs practice.

Shutter speed is how long the shutter stays open to catch the available light, and again, by controlling this you can get some interesting effects.

In sport, a fast shutter speed lets you freeze a race car mid-motion, while panning with a slower shutter keeps the car sharp and blurs the background; slower still, on a tripod, you get that dreamy moving-water look.

Image Courtesy Ross Gibb Photography

Shutter speed also plays a major role in creating slow motion, and there is a tutorial on that here.
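Shutter speed and aperture trade off against each other: the exposure value (EV) captures this, and opening up one stop while halving the exposure time leaves the total light unchanged. A short Python sketch (the function name is mine, purely illustrative):

```python
import math

def exposure_value(n, shutter_s):
    """EV at a fixed ISO: log2(N^2 / t) for f-number N and shutter time t seconds."""
    return math.log2(n ** 2 / shutter_s)

# Opening up one stop while halving the exposure time leaves EV unchanged:
ev_a = exposure_value(8, 1 / 125)      # f/8 at 1/125 s
ev_b = exposure_value(5.657, 1 / 250)  # f/5.6 (exactly 2**2.5) at 1/250 s
# ev_a and ev_b agree, so both settings admit the same total light,
# but the second freezes motion better and has a shallower depth of field.
```

This is why your camera’s shutter-priority and aperture-priority modes can both arrive at the same exposure by different routes.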

I did a tutorial on depth of field, focus and shutter speed during one of my trips up north to Exmouth and you can read that here.


ISO

In the “old days” this was called ASA and referred to the “speed” of a piece of film stock. The speed was a reference to how the film reacted to light – or the lack of it.

This still applies today, but of course in the digital world there is no film as such.

Modern day cameras have astonishing ISO ranges and what they do is increase or decrease the sensitivity to light. This means that in low light, cranking up the ISO assists in the exposure, but be aware, as ISO increases, so does the graininess of the image and this is caused by digital “noise”. Once again, while upping the ISO can compensate for low light, you cannot beat adding the real thing.
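That “cranking up” follows the same stop arithmetic as aperture and shutter: each doubling of the ISO gains one stop of sensitivity. A small illustrative Python sketch (names my own):

```python
import math

def iso_stops(iso, base_iso=100):
    """Extra stops of light sensitivity gained relative to a base ISO."""
    return math.log2(iso / base_iso)

# Going from ISO 100 to ISO 800 gains three stops, so a 1/15 s exposure
# can be shortened by 2^3 = 8x to roughly 1/125 s (marked shutter speeds
# are rounded; the exact figure here is 1/120 s).
shutter = (1 / 15) / 2 ** iso_stops(800)
```

The catch, as noted above, is that those three “free” stops arrive with extra digital noise.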


Composition

Composition refers to how the subject(s) in your image, and the image overall, “look”. The whole idea is to make an image that is either pleasant to the eye or assists in telling the story. Or both.

Of course, if the story demands it, a composition can also create a jarring image.

As a sideline, checking the composition of an image makes sure there are no glaring errors. A classic example of this was in the 1960s with the UK version of the TV show Robin Hood which of course was set in the Middle Ages.

The show’s titles started dramatically with Robin firing an arrow from his bow and the camera following it. Sadly, someone forgot to check the trajectory and consequently the arrow flew neatly past a long line of telegraph poles.

A common newbie mistake is finding you have a tree or other object growing out of someone’s head.

Composition is an art form unto itself and most of it comes from watching others who have been doing it for years (think directors like Stanley Kubrick, for example) and sheer trial and error.

There are some tools to help you though, and one basic one is the grid lines you can overlay on the viewfinder / LCD of many cameras.

The most common “rule” in composition is the Rule of Thirds, but it by no means has to be followed religiously every time. The aforementioned Mr Kubrick fastidiously ignored it, breaking every convention in the book by always having his subject at the dead centre of the frame.

At the very basic level, before pressing the shutter release you’ll get used to looking critically at the image in the viewfinder or LCD, scanning your eye over it to make sure no shadows are encroaching on the image, the horizon is straight, no foreign objects have snuck into frame and so on.

If you want a very good composition reference guide, Adobe has one here.


I have only scratched the surface of these subjects, but hopefully I have given you a starting point to investigate further and hone your skills, making you a better and more creative photographer / videographer.