COP28, to be held shortly in the United Arab Emirates (UAE), will open with the acceptance that current efforts to limit greenhouse gas (GHG) emissions fall well short of the targets needed to prevent global temperatures rising beyond 1.5°C. The UAE's COP28 president has highlighted that approximately 22 gigatonnes of GHG emissions need to be cut over the next seven years, equivalent to a 43% reduction from 2019 levels by 2030. These are staggering numbers, and they represent the continuing global failure to make any measurable difference so far.
The next Conference of the Parties (COP28) to the United Nations Framework Convention on Climate Change will be held in the United Arab Emirates between 30 November and 12 December 2023. As the numbering suggests, this is the 28th meeting of representatives from countries around the world. With climate change now measurably underway, how effective have the previous 27 meetings been? And will the 28th achieve more substantial progress than its predecessors?
With "actionism" as its theme and buzzword, the conference promises that the first global stocktake of the implementation of the Paris Agreement will conclude at COP28, with the aim of assessing the world's collective progress towards its climate goals. Given current global environmental trends, it may prove a disappointing exercise.
Chemical contamination of both the general environment and the human population has long been a deep concern, and for the most part it has been treated as a problem that can be controlled with an "it's-not-too-late" ethos. For this issue, however, that ethos cannot apply.
The US Centers for Disease Control and Prevention (CDC) provides a basic understanding of the scale of the problem, which is now an irreversible contamination of worldwide significance.
The CDC states: "The per- and polyfluoroalkyl substances (PFAS) are a group of chemicals used to make fluoropolymer coatings and products that resist heat, oil, stains, grease and water. Fluoropolymer coatings can be in a variety of products. These include clothing, furniture, adhesives, food packaging, heat-resistant non-stick cooking surfaces, and the insulation of electrical wire. Many PFAS, including perfluorooctane sulfonic acid (PFOS) and perfluorooctanoic acid (PFOA), are a concern because they:
do not break down in the environment
can move through soils and contaminate drinking water
build up (bioaccumulate) in fish and wildlife
PFAS are found in rivers and lakes and in many types of animals on land and water"
These days, we don’t think much about being able to access a course of antibiotics to head off an infection. But that wasn’t always the case – antibiotics have been available for less than a century.
Before that, patients would die of relatively trivial infections that became more serious. Some serious infections, such as those involving the heart valves, were inevitably fatal.
Other serious infections, such as tuberculosis, weren't always fatal. Up to half of those with the most severe forms died within a year, but some people recovered without treatment and the remainder had ongoing chronic infections that slowly ate away at the body over many years.
Once we had antibiotics, the outcomes for these infections were much better.
Life (and death) before antibiotics
You’ve probably heard of Alexander Fleming’s accidental discovery of penicillin, when fungal spores landed on a plate with bacteria left over a long weekend in 1928.
But the first patient to receive penicillin was an instructive example of the impact of treatment.
In 1941, Constable Albert Alexander had a scratch on his face that had become infected.
He was hospitalised but despite various treatments, the infection progressed to involve his head. This required removing one of his eyes.
Howard Florey, the Australian pharmacologist then working in Oxford, was concerned penicillin could be toxic in humans. Therefore, he felt it was only ethical to give this new drug to a patient in a desperate condition.
Constable Alexander was given the available dose of penicillin. Within the first day, his condition had started to improve.
But back then, penicillin was difficult to produce. One way of extending the limited supply was to “recycle” penicillin that was excreted in the patient’s urine. Despite this, supplies ran out by the fifth day of Alexander’s treatment.
Without further treatment, the infection again took hold. Constable Alexander eventually died a month later.
We now face a world where we are potentially running out of antibiotics – not because of difficulties manufacturing them, but because they’re losing their effectiveness.
What do we use antibiotics for?
We currently use antibiotics in humans and animals for a variety of reasons. Antibiotics reduce the duration of illness and the chance of death from infection. They also prevent infections in people who are at high risk, such as patients undergoing surgery and those with weakened immune systems.
But antibiotics aren’t always used appropriately. Studies consistently show a dose or two will adequately prevent infections after surgery, but antibiotics are often continued for several days unnecessarily. And sometimes we use the wrong type of antibiotic.
Surveys have found 22% of antimicrobial use in hospitals is inappropriate.
In some situations, this is understandable. Infections in different body sites are usually due to different types of bacteria. When the diagnosis isn’t certain, we often err on the side of caution by giving broad spectrum antibiotics to make sure we have active treatments for all possible infections, until further information becomes available.
In other situations, there is a degree of inertia. If the patient is improving, doctors tend to simply continue the same treatment, rather than change to a more appropriate choice.
In general practice, the issues of diagnostic uncertainty and therapeutic inertia are often magnified. Patients who recover after starting antibiotics don't usually require tests or come back for review, so there is no easy way of knowing if the antibiotic was actually required.
Antibiotic prescribing can be more complex again if patients are expecting “a pill for every ill”. While doctors are generally good at educating patients when antibiotics are not likely to work (for example, for viral infections), without confirmatory tests there can always be a lingering doubt in the minds of both doctors and patients. Or sometimes the patient goes elsewhere to find a prescription.
For other infections, resistance can develop if treatments aren’t given for long enough. This is particularly the case for tuberculosis, caused by a slow growing bacterium that requires a particularly long course of antibiotics to cure.
As in humans, antibiotics are also used to prevent and treat infections in animals. However, a proportion of antibiotics are used for growth promotion. In Australia, an estimated 60% of antibiotics were used in animals between 2005 and 2010, despite growth promotion being phased out.
Why is overuse a problem?
Bacteria become resistant to the effect of antibiotics through natural selection – those that survive exposure to antibiotics are the strains that have a mechanism to evade their effects.
For example, antibiotics are sometimes given to prevent recurrent urinary tract infections, but as a consequence, any infection that does develop tends to involve resistant bacteria.
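To see how quickly that selection can play out, here is a deliberately simple toy model in Python; the starting numbers, growth factor and kill rates are invented for the illustration and are not taken from any study.

```python
# Toy model of selection for antibiotic resistance (illustrative numbers only).
susceptible = 1_000_000   # bacteria without a resistance mechanism
resistant = 10            # a rare pre-existing resistant strain

GROWTH = 1.5              # per-generation growth factor for both strains
KILL_SUSCEPTIBLE = 0.9    # fraction of susceptible cells killed each generation
KILL_RESISTANT = 0.05     # resistant cells largely evade the drug

for generation in range(1, 11):
    susceptible = susceptible * GROWTH * (1 - KILL_SUSCEPTIBLE)
    resistant = resistant * GROWTH * (1 - KILL_RESISTANT)
    total = susceptible + resistant
    print(f"gen {generation:2d}: resistant fraction = {resistant / total:.1%}")
```

In this sketch the resistant strain dominates the population within a handful of generations, even though it began as a vanishingly small minority.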
When resistance to the commonly used first-line antibiotics occurs, we often need to reach deeper into the bag to find other effective treatments.
Some of these last-line antibiotics are those that had been superseded because they had serious side effects or couldn’t be given conveniently as tablets.
New drugs for some bacteria have been developed, but many are much more expensive than older ones.
Treating antibiotics as a valuable resource
The idea of antibiotics as a valuable resource has led to the concept of "antimicrobial stewardship", with programs to promote their responsible use. It's a similar idea to environmental stewardship to prevent climate change and environmental degradation.
Antibiotics are a rare class of medication where treatment of one patient can potentially affect the outcome of other patients, through the transmission of antibiotic resistant bacteria. Therefore, like efforts to combat climate change, antibiotic stewardship relies on changing individual actions to benefit the broader community.
Like climate change, antibiotic resistance is a complex problem when seen in a broader context. Studies have linked resistance to governance factors such as corruption, and to infrastructure, including the availability of electricity and public services. This highlights that there are broader "causes of the causes", such as public spending on sanitation and health care.
Other studies have suggested individuals need to be considered within the broader social and institutional influences on prescribing behaviour. Like all human behaviour, antibiotic prescribing is complicated, and factors like what doctors feel is “normal” prescribing, whether junior staff feel they can challenge senior doctors, and even their political views may be important.
There are also issues with the economic model for developing new antibiotics. When a new antibiotic is first approved for use, the first reaction for prescribers is not to use it, whether to ensure it retains its effectiveness or because it is often very expensive.
However, this doesn’t really encourage the development of new antibiotics, particularly when pharma research and development budgets can easily be diverted to developing drugs for conditions patients take for years, rather than a few days.
The slow moving pandemic of resistance
If we fail to act, we are looking at an almost unthinkable scenario where antibiotics no longer work and we are cast back into the dark ages of medicine
– David Cameron, former UK Prime Minister
Antibiotic resistance is already a problem. Almost all infectious diseases physicians have had the dreaded call about patients with infections that were essentially untreatable, or where they had to scramble to find supplies of long-forgotten last-line antibiotics.
There are already hospitals in some parts of the world that have had to carefully consider whether it’s still viable to treat cancers, because of the high risk of infections with antibiotic-resistant bacteria.
A global study estimated that in 2019, almost 5 million deaths occurred with an infection involving antibiotic-resistant bacteria. Some 1.3 million would not have occurred if the bacteria were not resistant.
The UK's 2014 O'Neill report predicted that, based on trends at the time, deaths from antimicrobial resistance could rise to 10 million a year by 2050 and cost 2-3.5% of global GDP.
What can we do about it?
There is a lot we can do to prevent antibiotic resistance. We can:
raise awareness that many infections will get better by themselves, and don't necessarily need antibiotics
use the antibiotics we have more appropriately and for as short a time as possible, supported by co-ordinated clinical and public policy, and national oversight
monitor for infections due to resistant bacteria to inform control policies
reduce the inappropriate use of antibiotics in animals, such as growth promotion
reduce cross-transmission of resistant organisms in hospitals and in the community
prevent infections by other means, such as clean water, sanitation, hygiene and vaccines
continue developing new antibiotics and alternatives to antibiotics and ensure the right incentives are in place to encourage a continuous pipeline of new drugs.
Reports this week suggest a near-collision between an Australian satellite and a suspected Chinese military satellite.
Meanwhile, earlier this month, the US government issued the first ever space junk fine. The Federal Communications Commission handed a US$150,000 penalty to the DISH Network, a publicly traded company providing satellite TV services.
It came as a surprise to many in the space industry, as the fine didn't relate to any recent debris. It was issued for EchoStar-7, a communications satellite that has been in space for more than 21 years and failed to meet the orbit requirements outlined in a previously agreed debris mitigation plan.
The EchoStar-7 fine might be a US first, but it probably won’t be the last. We are entering an unprecedented era of space use and can expect the number of active satellites in space to increase by 700% by the end of the decade.
As our local space gets more crowded, keeping an eye on tens of thousands of satellites and bits of space junk will only become more important. So researchers have a new field for this: space domain awareness.
Three types of orbit, plus junk
Humans have been launching satellites into space since 1957 and in the past 66 years have become rather good at it. There are currently more than 8,700 active satellites in various orbits around Earth.
Satellites tend to be in three main orbits, and understanding these is key to understanding the complex nature of space debris.
The most common orbit for satellites is low Earth orbit, with at least 5,900 active satellites. Objects in low Earth orbit tend to reside up to 1,000km above Earth’s surface and are constantly on the move. The International Space Station is an example of a low Earth orbit object, travelling around Earth 16 times every day.
Higher up is the medium Earth orbit, where satellites sit between 10,000 and 20,000km above Earth. It’s not a particularly busy place, but is home to some of the most important satellites ever launched – they provide us with the global positioning system or GPS.
Finally, we have very high altitude satellites in geosynchronous orbit. In this orbit, satellites are upwards of 35,000km above Earth, in orbits that match the rate of Earth’s rotation. One special type of this orbit is a geostationary Earth orbit. It lies on the same plane as Earth’s equator, making the satellites appear stationary from the ground.
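To see where these altitudes and orbital rates come from, here is a quick back-of-the-envelope sketch in Python using Kepler's third law. The constants for Earth's gravitational parameter and radius, and the roughly 420km ISS altitude, are standard textbook values assumed for the illustration, not figures from any tracking program.

```python
import math

# Standard values, assumed for this illustration:
MU = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_378_137.0  # Earth's equatorial radius, m

def orbital_period(altitude_m: float) -> float:
    """Circular-orbit period (seconds) at a given altitude, via Kepler's third law."""
    r = R_EARTH + altitude_m
    return 2 * math.pi * math.sqrt(r**3 / MU)

def altitude_for_period(period_s: float) -> float:
    """Altitude (metres) of a circular orbit with the given period."""
    r = (MU * period_s**2 / (4 * math.pi**2)) ** (1 / 3)
    return r - R_EARTH

# Low Earth orbit: the ISS flies at roughly 420 km altitude.
iss_period = orbital_period(420e3)
print(f"ISS-like orbit: {iss_period/60:.0f} min per lap, "
      f"about {86400/iss_period:.1f} laps per day")

# Geosynchronous orbit: one lap per sidereal day (~86,164 s).
geo_alt = altitude_for_period(86164.1)
print(f"Geosynchronous altitude: about {geo_alt/1e3:,.0f} km")
```

A satellite at roughly 400km altitude laps Earth about every 93 minutes, around 15 to 16 times a day, while an orbit lasting one sidereal day works out at close to 36,000km up, matching the figures above.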
As you can tell, Earth’s surrounds are buzzing with satellite activity. It only gets more chaotic when we factor in space junk, defined as disused artificial debris in orbit around Earth.
Space junk can range from entire satellites that are no longer in use or working, down to millimetre-wide bits of spacecraft and launch vehicles left in orbit. Latest estimates suggest there are more than 130 million pieces of space debris, with only 35,000 of those large enough (greater than 10cm) to be routinely tracked from the ground.
How do we track them all?
This is where space domain awareness comes in. It is the field of detecting, tracking and monitoring objects in Earth’s orbit, including active satellites and space debris.
We do much of this with ground-based tracking, either through radar or optical systems like telescopes. While radar can easily track objects in low Earth orbit, higher up we need optical sensors. Objects in medium Earth orbit and geostationary orbit can be tracked using sunlight reflected towards Earth.
For reliable and continuous space domain awareness, we need multiple sensors contributing to this around the globe.
Below you can see what high-altitude satellites can look like to telescopes on Earth, appearing to stay still as the stars move by.
Australia’s role in space awareness
Thanks to our position on Earth, Australia has a unique opportunity to contribute to space domain awareness. The US already operates several facilities on the west coast of Australia as part of its Space Surveillance Network. That's because on the west coast, telescopes can work under dark night skies with minimal light pollution from large cities.
Furthermore, we are currently working on a space domain awareness technology demonstrator (a proof of concept), funded by SmartSat CRC. This is a government-funded consortium of universities and other research organisations, along with industry partners such as the IT firm CGI.
We are combining our expertise in observational astrophysics, advanced data visualisation, artificial intelligence and space weather. Our goal is to have technology that understands what is happening in space minute-by-minute. Then, we can line up follow-up observations and monitor the objects in orbit. Our team is currently working on geosynchronous orbit objects, which includes active and inactive satellites.
EchoStar-7 was just one example of the fate of a retired spacecraft – the FCC is sending a strong warning to all other companies to ensure their debris mitigation plans are met.
Inactive objects in orbit could pose a collision risk to each other, leading to a rapid increase in space debris. If we want to use Earth’s space domain for as long as possible, we need to keep it safe for all.
Acknowledgment: The authors would like to thank Sholto Forbes-Spyratos, military space lead at CGI Space, Defence and Intelligence Australia, for his contribution to this article.
If we’re upset about the price of petrol, why do we drive the vehicles we do?
SUVs (so-called sport utility vehicles) use more fuel per kilometre than standard cars – according to the International Energy Agency, up to 25% more.
They weigh more than standard cars – about 100 kilograms more.
And they emit more carbon than standard cars. In Australia, medium-size SUVs emit 14% more carbon per kilometre travelled than medium-size cars. Large SUVs emit 30% more than large cars.
Yet we’re buying them at a rate that would have been unimaginable even a decade ago.
SUVs outsell passenger cars 3 to 1
As recently as 2012, more than half the new vehicles sold in Australia were “passenger cars” – the standard low-slung cars of the type we were used to. About one-quarter were SUVs.
Back further, in the early 1990s, three-quarters of the new vehicles we bought were passenger cars, and only 8% SUVs.
Yet after an explosion in SUV sales, today every second vehicle bought is an SUV. In September, SUVs accounted for 58% of new vehicle sales. Passenger cars accounted for just 17%. This means SUVs outsell passenger cars three to one.
Like country music, SUVs are hard to define, but you know one when you see one.
Standard passenger cars (be they hatches, sedans or wagons) sit closer to the ground, are usually lighter, and are less likely to kill or seriously injure pedestrians and cyclists, according to US insurers.
So common have the new larger SUVs become that Standards Australia is considering increasing the length of a standard parking bay by 20cm. It wants comments by November.
Also taking market share from smaller standard cars are what we in Australia call utes, which are standard vehicles (they used to be Falcons and Commodores) with a built-in tray attached at the rear.
Utes are categorised as commercial vehicles, even though these days they tend to have four doors rather than two. They are also just as likely to be used for moving families as equipment, even if bought with small business tax concessions.
Vehicles defined as commercial, the bulk of them utes, accounted for one in five vehicles sold a decade ago. Now they are one in four, outselling passenger cars.
Tax only explains so much
Cars get special treatment in Australia’s tax system.
If an employer provides them and their private use is "minor, infrequent and irregular", or if they are utes "not designed for the principal purpose of carrying passengers", they can escape the fringe benefits tax.
And from time to time small businesses get offered instant asset writeoffs, which means that all or part of the cost of the car can be written off against tax.
But apart from perhaps helping to explain the increasing preference for utes, these concessions seem insufficient to explain the demise of the standard passenger car and the rise of the expensive (and more expensive to fuel) alternatives.
One obvious attraction of SUVs is that their occupants feel safer in them. But, in an information paper, Australia's Bureau of Infrastructure and Transport Research Economics notes that SUVs "appear to be more likely to kill pedestrians than cars".
They also appear more likely than standard cars to kill the occupants of standard cars in a crash, largely because they sit higher – a phenomenon the insurance industry refers to as "incompatibility".
The bureau calls this the "other side of the coin".
But I think that for buyers of SUVs, it might be the same side of the coin. That is, I think it might be becoming a perverse and macabre argument for buying SUVs.
If SUVs are becoming dominant and they put other road users at risk, it makes sense not to be one of those other road users.
I am not suggesting that danger from SUVs is the only reason for the flood of buyers switching to SUVs. But I am suggesting it has helped contribute to a snowballing in demand for SUVs, along with fashion, and changed views about what’s normal.
I’m not sure what can be done at this stage. Higher petrol prices ought to have helped, but they don’t seem to have.
SUV purchases have increased, even as petrol prices have climbed. Extra taxes have been proposed to help curb road deaths, but they mightn’t help either. SUVs are already expensive.
Tighter standards would help
One thing we ought to do straight away is to shift the burden of decision-making from buyers to makers, by imposing fuel-efficiency standards on the vehicles manufacturers sell.
Ideally, those standards would require the entire fleet of vehicles sold by each manufacturer to meet a gradually-tightening average efficiency standard.
Putting more electric vehicles into each fleet would help. But so would increasing the efficiency of its conventionally-powered SUVs – which would mean reducing their weight, and with it, their danger to other people on the road.
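As a rough sketch of how a fleet-average standard operates (the vehicle mix, emissions figures and yearly caps below are invented for the illustration, not proposed or actual values), a manufacturer's compliance is judged on the sales-weighted average emissions of everything it sells, so low-emission models offset higher-emitting SUVs:

```python
# Hypothetical sales mix for one manufacturer (illustrative figures only).
fleet = [
    # (model type, vehicles sold, CO2 grams per km)
    ("small car", 20_000, 150),
    ("large SUV", 50_000, 210),
    ("electric vehicle", 10_000, 0),
]

# Hypothetical, gradually tightening fleet-average caps (g CO2/km).
targets = {2025: 170, 2027: 150, 2029: 130}

total_sold = sum(units for _, units, _ in fleet)
fleet_average = sum(units * co2 for _, units, co2 in fleet) / total_sold

for year, cap in targets.items():
    status = "complies" if fleet_average <= cap else "exceeds the cap"
    print(f"{year}: fleet average {fleet_average:.0f} g/km vs cap {cap} g/km -> {status}")
```

In this made-up example the manufacturer scrapes under the early cap but misses the later ones, so it must sell more electric vehicles or make its SUVs lighter and more efficient. That is exactly the pressure such a standard puts on makers rather than buyers.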
The design of the scheme is up for grabs, and the Grattan Institute’s Marion Terrill has made a submission.
She says regardless of the switch to electric cars, Australians are going to be buying petrol and diesel vehicles for some time. That’s why it’s so important those cars become as fuel efficient (and, she could add, as safe) as they can be.
It might sound like science fiction, but “marine cloud brightening” is being seriously considered as a way to shield parts of the ocean from extreme heat.
We're using water cannons to spray seawater into the sky. This causes brighter, whiter clouds to form. These low marine clouds reflect sunlight away from the ocean's surface, protecting the marine life below from the worst of climate change.
Australia’s Reef Restoration and Adaptation Program – a collaboration between several universities, CSIRO and the Australian Institute of Marine Science – is exploring whether cloud brightening could reduce coral bleaching. As an oceanographer and engineer I lead the program’s research into cooling and shading techniques.
We started exploring cloud brightening after the mass bleaching event in 2016. First, we needed to develop and test the underlying technologies in the lab. Then we began pilot testing in the central Great Barrier Reef near Townsville during January 2020. After several iterations we have now moved beyond “proof of concept” to investigating the response of the clouds themselves.
A bright idea
British cloud physicist John Latham originally proposed cloud brightening in 1990 as a way to control global warming by altering Earth’s energy balance. He calculated that brightening clouds across the most susceptible regions of the world’s oceans could counteract the global warming caused by a doubling of preindustrial atmospheric carbon dioxide. That’s a level likely to be reached by the year 2060.
Recently, scientists have begun to consider regional rather than global application of cloud brightening. Could brightening clouds directly over the Great Barrier Reef for a few months reduce coral bleaching during a marine heat wave?
Modelling studies are encouraging and suggest it could delay the expected decline in coral cover. This could buy valuable time for the reef while the world transitions away from fossil fuels.
Lowering the heat stress on the ecosystem would produce other benefits when combined with other reef interventions – such as improved control of invasive crown of thorns starfish and planting of corals with increased heat tolerance.
But these studies also show there’s a limit to what can be achieved. Long-term benefits are only possible if the cloud brightening activity occurs alongside aggressive emissions reductions.
Cloud brightening does have risks as well as benefits, but the prospect of intermittent regional use is very different to large-scale “solar geo-engineering” proposals for shading and cooling the whole planet.
We expect the regional effect will be short-lived and reversible, which is reassuring. The technology must be operated continuously to modify clouds and could be stopped at any time. The sea salt particles sprayed in the process typically only persist in the atmosphere for one to several days.
How do you brighten a cloud?
A warm cloud (as opposed to an ice cloud) is a collection of small water droplets floating in the air.
A cloud of many small droplets is brighter than one with fewer large droplets – even if both clouds contain the same amount of water overall.
Every droplet begins with the condensation of water vapour around a nucleus, which can be almost any kind of tiny particle suspended in air.
Typically, in the lower atmosphere over land there are thousands to tens of thousands of these tiny particles suspended in every cubic centimetre of air. We call these airborne particles “aerosols”.
Aerosols may be natural such as dust, sea salt, pollen, ash and sulphates. Or they may come from human activity such as burning fossil fuels or vegetation, manufacturing, vehicle exhaust and aerosol spray cans.
Out over the remote ocean, however, the air is much cleaner, with far fewer aerosol particles. When a cloud forms in this cleaner air, water vapour condenses around fewer nuclei, creating fewer, larger droplets. Large droplets reflect less light for the same volume of cloud water.
To brighten such clouds, we can spray large quantities of microscopic seawater droplets into the air. This process of atomising seawater mimics the generation of sea salt aerosols by wind and waves in the ocean. If these are incorporated into a cloud and create extra droplets, the cloud will be brightened.
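The geometry behind this is straightforward: for a fixed amount of cloud water, splitting it among more, smaller droplets increases the total reflective surface area, which scales with the cube root of droplet number. The short sketch below illustrates the effect; the droplet concentrations are round illustrative numbers, not measurements from the program.

```python
import math

# Fixed amount of cloud water, varying droplet number (illustrative values only).
LIQUID_WATER = 1e-6  # m^3 of water per m^3 of cloud (arbitrary fixed amount)

def total_cross_section(droplets_per_m3: float) -> float:
    """Total droplet cross-sectional area per m^3 of cloud, for a fixed water volume."""
    volume_each = LIQUID_WATER / droplets_per_m3
    radius = (3 * volume_each / (4 * math.pi)) ** (1 / 3)
    return droplets_per_m3 * math.pi * radius**2

clean = total_cross_section(50e6)    # ~50 droplets per cm^3 (clean marine air)
seeded = total_cross_section(200e6)  # ~200 droplets per cm^3 (extra nuclei added)

print(f"Relative reflective area: {seeded / clean:.2f}x")  # (200/50)**(1/3) ~= 1.59
```

Quadrupling the droplet count for the same amount of water increases the reflective cross-section by roughly 60%, which is why adding extra nuclei can brighten a cloud without adding any water to it.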
Although scientists have researched cloud brightening for more than 30 years, no one had ever directly tested the theory. In Australia, we have now developed technology to a point where we are starting to measure the response of the clouds.
We are beginning such tests with the support and permission of Traditional Owners, who have sustainably managed their Sea Country for tens of thousands of years.
Our research program involves more than 15 research institutions and has multiple levels of governance and oversight.
Not so far-fetched
Most people probably don’t realise we are already inadvertently brightening the clouds. The Intergovernmental Panel on Climate Change estimates humanity’s unintentional release of aerosols offsets around 30% of the warming effect due to greenhouse gases.
Sulphates in ship exhaust are such a potent source of aerosols for droplet formation that the passage of ships leaves cloud trails, called ship tracks.
When the International Maritime Organisation introduced new rules limiting the sulphur content of marine fuels, the number and extent of ship tracks drastically reduced, especially in the Northern Hemisphere. A recent study even suggests the devastating heat wave that swept the Northern Hemisphere earlier this year was worsened by the absence of ship tracks.
The world-first research we are conducting in Australia aims to determine if we could harness the clouds in an effective, environmentally responsible and socially acceptable manner for the future conservation of one of our most precious ecosystems.