Saturday, 28 October 2023

Junk in Space - the crowded orbit around Earth

 

Space is getting crowded with satellites and space junk. How do we avoid collisions?

NASA ODPO
Sara Webb, Swinburne University of Technology; Brett Carter, RMIT University, and Christopher Fluke, Swinburne University of Technology

Reports this week suggest a near-collision between an Australian satellite and a suspected Chinese military satellite.

Meanwhile, earlier this month, the US government issued the first ever space junk fine. The Federal Communications Commission handed a US$150,000 penalty to the DISH Network, a publicly traded company providing satellite TV services.

It came as a surprise to many in the space industry, as the fine didn’t relate to any recent debris – it was issued for a communications satellite that has been in space for more than 21 years. It was EchoStar-7, which failed to meet the orbit requirements outlined in a previously agreed debris mitigation plan.

The EchoStar-7 fine might be a US first, but it probably won’t be the last. We are entering an unprecedented era of space use and can expect the number of active satellites in space to increase by 700% by the end of the decade.

As our local space gets more crowded, keeping an eye on tens of thousands of satellites and bits of space junk will only become more important. So researchers have a new field for this: space domain awareness.

Three types of orbit, plus junk

Humans have been launching satellites into space since 1957 and in the past 66 years have become rather good at it. There are currently more than 8,700 active satellites in various orbits around Earth.

Satellites tend to be in three main orbits, and understanding these is key to understanding the complex nature of space debris.

Types of orbits around Earth classified by altitude (not to scale). Pexels/The Conversation, CC BY-SA

The most common orbit for satellites is low Earth orbit, with at least 5,900 active satellites. Objects in low Earth orbit tend to reside up to 1,000km above Earth’s surface and are constantly on the move. The International Space Station is an example of a low Earth orbit object, travelling around Earth 16 times every day.

Higher up is the medium Earth orbit, where satellites sit between 10,000 and 20,000km above Earth. It’s not a particularly busy place, but is home to some of the most important satellites ever launched – they provide us with the global positioning system or GPS.

Finally, we have very high altitude satellites in geosynchronous orbit. In this orbit, satellites are upwards of 35,000km above Earth, in orbits that match the rate of Earth’s rotation. One special type of this orbit is a geostationary Earth orbit. It lies on the same plane as Earth’s equator, making the satellites appear stationary from the ground.
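As an illustrative aside (not part of the original article), the orbital periods behind these altitude bands follow from Kepler's third law, T = 2π√(a³/μ). A minimal Python sketch, using approximate values for Earth's radius and gravitational parameter:

```python
import math

MU = 3.986004418e14    # Earth's standard gravitational parameter, m^3/s^2
R_EARTH = 6_371_000    # mean Earth radius, m

def orbital_period_hours(altitude_km: float) -> float:
    """Circular-orbit period from Kepler's third law: T = 2*pi*sqrt(a^3 / mu)."""
    a = R_EARTH + altitude_km * 1000   # semi-major axis, m
    return 2 * math.pi * math.sqrt(a**3 / MU) / 3600

# ISS-like low Earth orbit (~420 km): roughly 1.5 hours, i.e. ~16 orbits a day
print(round(orbital_period_hours(420), 2))
# Geostationary altitude (~35,786 km): close to 24 hours, matching Earth's rotation
print(round(orbital_period_hours(35_786), 2))
```

With an ISS-like altitude of about 420km the period comes out near 1.5 hours (about 16 orbits per day), while the geostationary altitude gives close to 24 hours, consistent with the orbits described above.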

Visualisation of The European Space Agency’s Space Debris Office statistics on space debris orbiting Earth (as of January 8 2021).

As you can tell, Earth’s surrounds are buzzing with satellite activity. It only gets more chaotic when we factor in space junk, defined as disused artificial debris in orbit around Earth.

Space junk can range from entire satellites that are no longer in use or working, down to millimetre-wide bits of spacecraft and launch vehicles left in orbit. Latest estimates suggest there are more than 130 million pieces of space debris, with only 35,000 of those large enough (greater than 10cm) to be routinely tracked from the ground.

How do we track them all?

This is where space domain awareness comes in. It is the field of detecting, tracking and monitoring objects in Earth’s orbit, including active satellites and space debris.

We do much of this with ground-based tracking, either through radar or optical systems like telescopes. While radar can easily track objects in low Earth orbit, higher up we need optical sensors. Objects in medium Earth orbit and geostationary orbit can be tracked using sunlight reflected towards Earth.

For reliable and continuous space domain awareness, we need multiple sensors contributing to this around the globe.

Below you can see what high-altitude satellites can look like to telescopes on Earth, appearing to stay still as the stars move by.

Tracking two Optus satellites 16km apart, using EOS’ 0.7m deep space telescope at Learmonth, Western Australia. Source: EOS - Electro Optic Systems.

Australia’s role in space awareness

Thanks to our position on Earth, Australia has a unique opportunity to contribute to space domain awareness. The US already houses several facilities on the west coast of Australia as part of the Space Surveillance Network. That’s because on the west coast, telescopes can work in dark night skies with minimal light pollution from large cities.

Furthermore, we are currently working on a space domain awareness technology demonstrator (a proof of concept), funded by SmartSat CRC. This is a government-funded consortium of universities and other research organisations, along with industry partners such as the IT firm CGI.

We are combining our expertise in observational astrophysics, advanced data visualisation, artificial intelligence and space weather. Our goal is to have technology that understands what is happening in space minute-by-minute. Then, we can line up follow-up observations and monitor the objects in orbit. Our team is currently working on geosynchronous orbit objects, which includes active and inactive satellites.

EchoStar-7 was just one example of the fate of a retired spacecraft – the FCC is sending a strong warning to all other companies to ensure their debris mitigation plans are met.

Inactive objects in orbit could pose a collision risk to each other, leading to a rapid increase in space debris. If we want to use Earth’s space domain for as long as possible, we need to keep it safe for all.

Acknowledgment: The authors would like to thank Sholto Forbes-Spyratos, military space lead at CGI Space, Defence and Intelligence Australia, for his contribution to this article.The Conversation

Sara Webb, Postdoctoral Research Fellow, Centre for Astrophysics and Supercomputing, Swinburne University of Technology; Brett Carter, Associate Professor, RMIT University, and Christopher Fluke, SmartSat Professorial Chair, Swinburne University of Technology

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Wednesday, 18 October 2023

Australia's obsession with SUVs.

Shutterstock

 

Where did the cars go? How heavier, costlier SUVs and utes took over Australia’s roads

Peter Martin, Crawford School of Public Policy, Australian National University

If we’re upset about the price of petrol, why do we drive the vehicles we do?

SUVs (so-called sport utility vehicles) use more fuel per kilometre than standard cars – according to the International Energy Agency, up to 25% more.

They weigh more than standard cars – about 100 kilograms more.

And they emit more carbon than standard cars. In Australia, medium-size SUVs emit 14% more carbon per kilometre travelled than medium-size cars. Large SUVs emit 30% more than large cars.

Yet we’re buying them at a rate that would have been unimaginable even a decade ago.

SUVs outsell passenger cars 3 to 1

As recently as 2012, more than half the new vehicles sold in Australia were “passenger cars” – the standard low-slung cars of the type we were used to. About one-quarter were SUVs.

Back further, in the early 1990s, three-quarters of the new vehicles we bought were passenger cars, and only 8% SUVs.

Yet after an explosion in SUV sales, today every second vehicle bought is an SUV. In September, SUVs accounted for 58% of new vehicle sales. Passenger cars accounted for just 17%. This means SUVs outsell passenger cars three to one.



Like country music, SUVs are hard to define, but you know one when you see one.

They are distinguished by being high and squarish – the words used in the official definition are “wagon body style and elevated ride height”, and generally big. They are usually four-wheel drives or all-wheel drives.

Standard passenger cars (be they hatches, sedans or wagons) sit closer to the ground, are usually lighter, and are less likely to kill or seriously injure pedestrians and cyclists, according to US insurers.

So common have the new larger SUVs become that Standards Australia is considering increasing the length of a standard parking bay by 20cm. It wants comments by November.

Also taking market share from smaller standard cars are what we in Australia call utes, which are standard vehicles (they used to be Falcons and Commodores) with a built-in tray attached at the rear.

1971 Holden Ute. Shutterstock

Utes are categorised as commercial vehicles, even though these days they tend to have four doors rather than two. They are also just as likely to be used for moving families as equipment, even if bought with small business tax concessions.

Australia’s National Transport Commission is so concerned about the rise in sales of both SUVs and utes, it warns they are “tempering Australia’s improvement in transport emissions”.

Vehicles defined as commercial, the bulk of them utes, accounted for one in five vehicles sold a decade ago. Now they are one in four, outselling passenger cars.



Tax only explains so much

Cars get special treatment in Australia’s tax system.

If an employer provides them and their private use is “minor, infrequent and irregular”, or if they are utes “not designed for the principal purpose of carrying passengers”, they can escape the fringe benefits tax.

And from time to time small businesses get offered instant asset writeoffs, which means that all or part of the cost of the car can be written off against tax.

But apart from perhaps helping to explain the increasing preference for utes, these concessions seem insufficient to explain the demise of the standard passenger car and the rise of the expensive (and more expensive to fuel) alternatives.

Australia’s Bureau of Infrastructure and Transport Research Economics identifies the obvious attractions: headroom, legroom and storage space, and the ability to drive on bad roads as well as good.

Danger is a perverse selling point

But, in an information paper, the bureau goes on to note that SUVs “appear to be more likely to kill pedestrians than cars”.

They also appear more likely than standard cars to kill the occupants of standard cars when the two collide, largely because they sit higher – a phenomenon the insurance industry refers to as “incompatibility”.

Australia’s Bureau of Infrastructure and Transport Research Economics refers to this as the “other side of the coin”.

But I think that for buyers of SUVs, it might be the same side of the coin. That is, I think it might be becoming a perverse and macabre argument for buying SUVs.

If SUVs are becoming dominant and they put other road users at risk, it makes sense not to be one of those other road users.



I am not suggesting that danger from SUVs is the only reason for the flood of buyers switching to SUVs. But I am suggesting it has helped contribute to a snowballing in demand for SUVs, along with fashion, and changed views about what’s normal.

I’m not sure what can be done at this stage. Higher petrol prices ought to have helped, but they don’t seem to have.

SUV purchases have increased, even as petrol prices have climbed. Extra taxes have been proposed to help curb road deaths, but they mightn’t help either. SUVs are already expensive.

Tighter standards would help

One thing we ought to do straight away is to shift the burden of decision-making from buyers to makers.

The federal government is about to roll out long-overdue fuel efficiency standards, of the kind already common in the rest of the world.

Ideally, those standards would require the entire fleet of vehicles sold by each manufacturer to meet a gradually-tightening average efficiency standard.

Putting more electric vehicles into each fleet would help. But so would increasing the efficiency of its conventionally-powered SUVs – which would mean reducing their weight, and with it, their danger to other people on the road.

The design of the scheme is up for grabs, and the Grattan Institute’s Marion Terrill has made a submission.

She says regardless of the switch to electric cars, Australians are going to be buying petrol and diesel vehicles for some time. That’s why it’s so important those cars become as fuel efficient (and, she could add, as safe) as they can be.The Conversation

Peter Martin, Visiting Fellow, Crawford School of Public Policy, Australian National University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Tuesday, 17 October 2023

Is environmental engineering the answer?

Southern Cross University

 Could ‘marine cloud brightening’ reduce coral bleaching on the Great Barrier Reef?

Daniel Patrick Harrison, Southern Cross University

It might sound like science fiction, but “marine cloud brightening” is being seriously considered as a way to shield parts of the ocean from extreme heat.

We’re using water cannons to spray seawater into the sky. This causes brighter, whiter clouds to form. These low marine clouds reflect sunlight away from the ocean’s surface, protecting the marine life below from the worst of climate change.

Australia’s Reef Restoration and Adaptation Program – a collaboration between several universities, CSIRO and the Australian Institute of Marine Science – is exploring whether cloud brightening could reduce coral bleaching. As an oceanographer and engineer I lead the program’s research into cooling and shading techniques.

We started exploring cloud brightening after the mass bleaching event in 2016. First, we needed to develop and test the underlying technologies in the lab. Then we began pilot testing in the central Great Barrier Reef near Townsville during January 2020. After several iterations we have now moved beyond “proof of concept” to investigating the response of the clouds themselves.

The Cloud Brightening Field Trip of 2021 (Southern Cross University)

A bright idea

British cloud physicist John Latham originally proposed cloud brightening in 1990 as a way to control global warming by altering Earth’s energy balance. He calculated that brightening clouds across the most susceptible regions of the world’s oceans could counteract the global warming caused by a doubling of preindustrial atmospheric carbon dioxide. That’s a level likely to be reached by the year 2060.

Recently, scientists have begun to consider regional rather than global application of cloud brightening. Could brightening clouds directly over the Great Barrier Reef for a few months reduce coral bleaching during a marine heat wave?

Modelling studies are encouraging and suggest it could delay the expected decline in coral cover. This could buy valuable time for the reef while the world transitions away from fossil fuels.

Lowering the heat stress on the ecosystem would produce other benefits when combined with other reef interventions – such as improved control of invasive crown of thorns starfish and planting of corals with increased heat tolerance.

But these studies also show there’s a limit to what can be achieved. Long-term benefits are only possible if the cloud brightening activity occurs alongside aggressive emissions reductions.

Cloud brightening does have risks as well as benefits, but the prospect of intermittent regional use is very different to large-scale “solar geo-engineering” proposals for shading and cooling the whole planet.

We expect the regional effect will be short-lived and reversible, which is reassuring. The technology must be operated continuously to modify clouds and could be stopped at any time. The sea salt particles sprayed in the process typically only persist in the atmosphere for one to several days.

Southern Cross University’s aerosol and cloud microphysics aircraft operating over the Southern Great Barrier Reef. Southern Cross University

How do you brighten a cloud?

A warm cloud (as opposed to an ice cloud) is a collection of small water droplets floating in the air.

A cloud of many small droplets is brighter than one with fewer large droplets – even if both clouds contain the same amount of water overall.

Every droplet begins with the condensation of water vapour around a nucleus, which can be almost any kind of tiny particle suspended in air.

Typically, in the lower atmosphere over land there are thousands to tens of thousands of these tiny particles suspended in every cubic centimetre of air. We call these airborne particles “aerosols”.

Aerosols may be natural such as dust, sea salt, pollen, ash and sulphates. Or they may come from human activity such as burning fossil fuels or vegetation, manufacturing, vehicle exhaust and aerosol spray cans.

In very clean maritime air, the aerosols available to form clouds are mainly sulphates and sea salt crystals. And they are few and far between, only a few hundred per cubic centimetre.

When a cloud forms under these conditions, water vapour is forced to condense around fewer nuclei, creating larger droplets and fewer of them. Large droplets reflect less light for the same volume of cloud water.

To brighten such clouds, we can spray large quantities of microscopic seawater droplets into the air. This process of atomising seawater mimics the generation of sea salt aerosols by wind and waves in the ocean. If these are incorporated into a cloud and create extra droplets, the cloud will be brightened.

Sea salt also provides additional shade by direct scattering of light.
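The geometry behind this brightening can be sketched with a little arithmetic (an illustration added here, not from the article). For a fixed volume of cloud water split into n equal spherical droplets, each droplet's radius scales as (V/n)^(1/3), so the total cross-sectional area available to scatter light grows as n^(1/3):

```python
import math

def total_cross_section(n_droplets: float, total_water_volume: float) -> float:
    """Total droplet cross-sectional area when a fixed water volume
    is divided into n equal spheres."""
    # radius of each droplet from the sphere-volume formula V = (4/3)*pi*r^3
    r = (3 * total_water_volume / (4 * math.pi * n_droplets)) ** (1 / 3)
    return n_droplets * math.pi * r**2

V = 1.0  # arbitrary fixed volume of cloud water
# Doubling the droplet count for the same water increases scattering area
# by a factor of 2^(1/3), i.e. about 26% more reflected light per unit water.
ratio = total_cross_section(200, V) / total_cross_section(100, V)
print(round(ratio, 3))
```

This is why adding extra nuclei, and hence more but smaller droplets, brightens a cloud even though no water has been added.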

The latest cloud brightening generator (V model) in action. Southern Cross University

Testing the theory

Although scientists have researched cloud brightening for more than 30 years, no one had ever directly tested the theory. In Australia, we have now developed technology to a point where we are starting to measure the response of the clouds.

We are beginning such tests with the support and permission of Traditional Owners, who have sustainably managed their Sea Country for tens of thousands of years.

Our research program involves more than 15 research institutions and has multiple levels of governance and oversight.

Not so far-fetched

Most people probably don’t realise we are already inadvertently brightening the clouds. The Intergovernmental Panel on Climate Change estimates humanity’s unintentional release of aerosols offsets around 30% of the warming effect due to greenhouse gases.

Sulphates in ship exhaust are such a potent source of aerosols for droplet formation, the passage of ships leaves cloud trails called ship tracks.

When the International Maritime Organisation introduced new rules limiting the sulphur content of marine fuels, the number and extent of ship tracks drastically reduced, especially in the Northern Hemisphere. A recent study even suggests the devastating heat wave that swept the Northern Hemisphere earlier this year was worsened by the absence of ship tracks.

The world-first research we are conducting in Australia aims to determine if we could harness the clouds in an effective, environmentally responsible and socially acceptable manner for the future conservation of one of our most precious ecosystems.The Conversation

Daniel Patrick Harrison, Senior Lecturer, Southern Cross University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Tuesday, 10 October 2023

What chance for humanity?

 Is there really a 1 in 6 chance of human extinction this century?

Shutterstock
Steven Stern, Bond University

In 2020, Oxford-based philosopher Toby Ord published a book called The Precipice about the risk of human extinction. He put the chances of “existential catastrophe” for our species during the next century at one in six.

It’s quite a specific number, and an alarming one. The claim drew headlines at the time, and has been influential since – most recently brought up by Australian politician Andrew Leigh in a speech in Melbourne.

It’s hard to disagree with the idea we face troubling prospects over the coming decades, from climate change, nuclear weapons and bio-engineered pathogens (all big issues in my view), to rogue AI and large asteroids (which I would see as less concerning).

But what about that number? Where does it come from? And what does it really mean?

Coin flips and weather forecasts

To answer those questions, we have to answer another first: what is probability?

The most traditional view of probability is called frequentism, and derives its name from its heritage in games of dice and cards. On this view, we know there is a one in six chance a fair die will come up with a three (for example) by observing the frequency of threes in a large number of rolls.

Or consider the more complicated case of weather forecasts. What does it mean when a weatherperson tells us there is a one in six (or 17%) chance of rain tomorrow?

It’s hard to believe the weatherperson means us to imagine a large collection of “tomorrows”, of which some proportion will experience precipitation. Instead, we need to look at a large number of such predictions and see what happened after them.

If the forecaster is good at their job, we should see that when they said “one in six chance of rain tomorrow”, it did in fact rain on the following day one time in every six.

So, traditional probability depends on observations and procedure. To calculate it, we need to have a collection of repeated events on which to base our estimate.
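To make that calibration idea concrete, here is a hypothetical sketch in Python. The outcome record is invented purely for illustration: we compare how often it actually rained after a forecaster said “one in six chance of rain tomorrow”.

```python
# Hypothetical record: 12 days on which a forecaster said "one in six (17%) chance of rain".
# outcomes[i] is 1 if it actually rained the following day, 0 if it stayed dry.
outcomes = [0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0]

# Frequentist check: does the observed rain frequency match the stated probability?
observed_frequency = sum(outcomes) / len(outcomes)
print(observed_frequency)  # 2 rainy days out of 12 = one in six, matching the forecast
```

With 2 rainy days in 12, the observed frequency equals the stated one-in-six chance, so this (toy) forecaster would be well calibrated.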

Can we learn from the Moon?

So what does this mean for the probability of human extinction? Well, such an event would be a one-off: after it happened, there would be no room for repeats.

Instead, we might find some parallel events to learn from. Indeed, in Ord’s book, he discusses a number of potential extinction events, some of which can potentially be examined in light of a history.

Counting craters on the Moon can give us clues about the risk of asteroid impacts on Earth. NASA

For example, we can estimate the chances of an extinction-sized asteroid hitting Earth by examining how many such space rocks have hit the Moon over its history. A French scientist named Jean-Marc Salotti did this in 2022, calculating the odds of an extinction-level hit in the next century at around one in 300 million.

Of course, such an estimate is fraught with uncertainty, but it is backed by something approaching an appropriate frequency calculation. Ord, by contrast, estimates the risk of extinction by asteroid at one in a million, though he does note a considerable degree of uncertainty.

A ranking system for outcomes

There is another way to think about probability, called Bayesianism after the English statistician Thomas Bayes. It focuses less on events themselves and more on what we know, expect and believe about them.

In very simple terms, we can say Bayesians see probabilities as a kind of ranking system. In this view, the specific number attached to a probability shouldn’t be taken directly, but rather compared to other probabilities to understand which outcomes are more and less likely.

Ord’s book, for example, contains a table of potential extinction events and his personal estimates of their probability. From a Bayesian perspective, we can view these values as relative ranks. Ord thinks extinction from an asteroid strike (one in a million) is much less likely than extinction from climate change (one in a thousand), and both are far less likely than extinction from what he calls “unaligned artificial intelligence” (one in ten).

The difficulty here is that initial estimates of Bayesian probabilities (often called “priors”) are rather subjective (for instance, I would rank the chance of AI-based extinction much lower). Traditional Bayesian reasoning moves from “priors” to “posteriors” by again incorporating observational evidence of relevant outcomes to “update” probability values.
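As a hedged illustration of that prior-to-posterior step, here is a minimal Bayes update in Python, using a toy die example rather than extinction risks (where relevant observations are scarce). The hypotheses and numbers are invented for illustration:

```python
# Two hypotheses about a die, updated after observing a single roll of three.
# H1: the die is fair (P(three) = 1/6); H2: the die is loaded (P(three) = 1/2).
prior = {"fair": 0.9, "loaded": 0.1}        # subjective prior beliefs
likelihood = {"fair": 1 / 6, "loaded": 1 / 2}  # P(observing a three | hypothesis)

# Bayes' rule: posterior is proportional to prior times likelihood
unnormalised = {h: prior[h] * likelihood[h] for h in prior}
evidence = sum(unnormalised.values())
posterior = {h: p / evidence for h, p in unnormalised.items()}

print(posterior)  # the loaded-die hypothesis gains weight after the observation
```

Here a single observed three shifts belief in the loaded die from 10% to 25%; with repeated observations the data increasingly dominate the subjective prior.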

And once again, outcomes relevant to the probability of human extinction are thin on the ground.

Subjective estimates

There are two ways to think about the accuracy and usefulness of probability calculations: calibration and discrimination.

Calibration is the correctness of the actual values of the probabilities. We can’t determine this without appropriate observational information. Discrimination, on the other hand, simply refers to the relative rankings.

We don’t have a basis to think Ord’s values are properly calibrated. Of course, this is not likely to be his intent. He himself indicates they are mostly designed to give “order of magnitude” indications.

Even so, without any related observational confirmation, most of these estimates simply remain in the subjective domain of prior probabilities.

Not well calibrated – but perhaps still useful

So what are we to make of “one in six”? Experience suggests most people have a less than perfect understanding of probability (as evidenced by, among other things, the ongoing volume of lottery ticket sales). In this environment, if you’re making an argument in public, an estimate of “probability” doesn’t necessarily need to be well calibrated – it just needs to have the right sort of psychological impact.

From this perspective, I’d say “one in six” fits the bill nicely. “One in 100” might feel small enough to ignore, while “one in three” might drive panic or be dismissed as apocalyptic raving.

As a person concerned about the future, I hope risks like climate change and nuclear proliferation get the attention they deserve. But as a data scientist, I hope the careless use of probability gets left by the wayside and is replaced by widespread education on its true meaning and appropriate usage.The Conversation

Steven Stern, Professor of Data Science, Bond University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Tuesday, 10 January 2023

Up in the cloud - data management in the 21st Century

Shutterstock
Cloud computing is one of the current buzzwords in the ICT sector and, like agile working (or hot desking), it has been sold to employees, consumers and the general public as the next best thing for flexibility, cost effectiveness and efficiency. However, in this age of hacking, denial-of-service attacks and related data risks, is it really more spin than substance? And what exactly is the 'cloud'?

In simple terms, cloud computing means using a network of remote servers accessed via the internet to store, manage and process data, rather than a local server or a personal computer. The advantages of cloud computing are promoted as: no capital outlay compared with a business buying its own hardware; fewer IT specialists needed to maintain and secure systems; and on-demand access to scalable computing resources suited to any form or size of organisation. The principal disadvantage is less direct control over the computing infrastructure that runs systems, and a reliance on a third-party provider which may be located on a different continent.

There is also a criticism that cloud systems encourage much more energy-intensive activity, given there is less reliance on local organisational computing and data storage capacity. To counter this impact, some large cloud providers have adopted green energy and carbon-neutral programs. Unfortunately, many of these carbon-neutral programs amount to little more than purchasing renewable energy certificates, rather than taking direct action to reduce energy emissions.

Cloud services are delivered in three forms -
  • infrastructure as a service (IaaS): the processing power and data storage capability,
  • platform as a service (PaaS): the hosting and deployment of applications for businesses,
  • software as a service (SaaS): centrally hosted and managed software for businesses.
Cloud services can also be accessed in different modes. In a public cloud, computing services are delivered over the public internet to various companies and users: multiple companies share pooled resources across a group of servers, although each company's data is kept hidden from other users. Another model is the private cloud, where computing resources are dedicated to a single organisation and accessed over the public internet or a high-speed link. While an organisation has greater control over a private cloud, that control and security come at a higher cost.

Major 'cloud' outages occurred in 2022, including incidents involving Google, Microsoft and Oracle. While cloud outages tend to take longer to resolve, their impact is often described as less severe than when a company suffers an internal IT failure.

Technology research firm Gartner has forecast that "worldwide end-user spending on public cloud services is forecast to grow 20.7 percent to a total of $591.8 billion in 2023", a clear increase from $490.3 billion in 2022.

The move to improve costs, efficiency and service quality through the "cloud" may yet come at a very high price when the loss of independence and sustainability is taken into account.

Saturday, 31 December 2022

Climate change - the value of tropical forests in controlling temperature

Tropical forest - Shutterstock
Most research and published articles on forests and global warming focus on their capacity for carbon sequestration, as well as the carbon currently retained within forests. This is particularly relevant when considering the climate effects that occur when forest cover, structure and composition change as a result of deforestation. However, tropical forests play a larger role than carbon sequestration alone.

Scientists from the University of Virginia (USA), the Woodwell Climate Research Center (USA) and the International Centre for Tropical Agriculture (Colombia) have published compelling research this year on the role of tropical forests in regulating temperature and local environments and in removing CO2 emissions. Forests account for much of the carbon removal by terrestrial ecosystems, which together take up 29% of annual carbon emissions. As the researchers have commented: "the biophysical effects of forest cover can contribute significantly to solving local adaptation challenges, such as extreme heat and flooding, at any latitude. The carbon benefits of forests at any latitude contribute meaningfully to global climate mitigation".

Key aspects of the report have found -
  • forests contain over 800 PgC (petagrams of carbon), almost as much as is currently stored in the atmosphere,
  • tropical forests have one of the fastest carbon sequestration rates per unit land area,
  • forests affect climate directly through three main biophysical mechanisms: albedo (the fraction of light that a surface reflects), evapotranspiration (ET) and canopy roughness,
  • in the tropics, where ET and roughness are the dominant biophysical drivers, forests cool the lower atmosphere and provide water vapour to support cloud formation,
  • forests partition incoming solar radiation between latent heat and sensible heat: "Deep roots and high leaf area make forests very efficient at moving water from the land surface to the atmosphere via ET, producing latent heat. Thus beneath the forest canopy, the sensible heat flux and associated surface temperature are relatively low, especially during the growing season when ET is high",
  • the role of forests in maintaining critical habitat for biodiversity is well known, but new research on extinction "confirms the role of forests in maintaining critical climates to support biodiversity. Changes in maximum temperature are driving extinction, not changes in average temperature",
  • forests minimise risks from drought associated with heat extremes: "A combination of deep roots, high water use efficiency and high surface roughness allow trees to continue transpiring during drought conditions and thus to dissipate heat and convey moisture to the atmosphere".
The full research article can be accessed at this link: Deforestation: biophysical effects on climate

Thursday, 29 December 2022

After the pandemic, working from home remains the new norm

Shutterstock
Over the past two years, the SARS-CoV-2 pandemic has, by necessity, forced employers and business groups to adjust their work models to a new hybrid of on-site attendance and working from home. As part of the process, numerous surveys have been conducted by employers and management consultancies on what the future may look like. One such research project, by PricewaterhouseCoopers, found the preferences of the Australian workforce to be distributed thus -
  • 16% a wholly virtual workplace
  • 25% mostly virtual work with some face-to-face
  • 35% a mix of face-to-face and virtual work
  • 14% mostly face-to-face with some remote working
  • 10% a traditional face-to-face work environment
What is clear from research into the new world of work and employment is that the proverbial genie is out of the bottle: the benefits of working from home are now plain to much of the workforce. Simply returning to the office full-time no longer satisfies employees, and working flexibly is the order of the day.