Sunday, 26 April 2026

Artificial intelligence Part 6: specific industry impacts - healthcare

ChatGPT image
The impact of AI on healthcare is more nuanced and varied than in most other white-collar sectors. Healthcare is more complex due to the need to retain a strong physical human presence in medical and care functions that cannot be automated or replaced by digital technology. Healthcare also carries a complex set of regulations, legal liability and an irreducible human dimension, with doctors, nurses and allied health professionals required for direct patient contact, whether face-to-face or through telehealth services.

In contrast, administrative and diagnostic supportive functions are highly exposed. A summary is provided below and is not exhaustive -

Clinical diagnosis and decision support
  • Radiology already has AI systems that match or exceed radiologists in detecting certain cancers (breast, lung, skin). A concern is potential overdiagnosis due to the sensitivity of the digital systems used. The radiologist's role is shifting towards oversight, complex-case handling and AI exception management. The risk in this field is that the number of radiologists needed may fall.
  • Pathology follows a similar pattern to radiology, as AI can analyse tissue samples at scale. The role of pathologists is being augmented rather than removed at this time.
  • Diagnostic support for general practitioners can be provided through AI tools that synthesise patient medical history, symptoms and test results. These tools are already being deployed; the intention, however, is to support the medical service doctors provide to patients, not substitute for it. A secondary aim is to reduce the need for specialist referrals, though this has yet to be achieved.
  • Dermatology and ophthalmology are two specialties heavily dependent on pattern recognition and will face some AI encroachment; however, as with other diagnostic tools, AI may serve a supportive function rather than replace the medical role.
Clinical administrative functions and documentation
  • Medical transcription is already largely automated, with clinical voice-to-text AI widely used.
  • Clinical note writing is being addressed by ambient AI scribes such as Nuance DAX. Documentation can consume 30-40% of physician time, so these tools offer medical practitioners a genuine quality-of-life improvement; however, they reduce demand for medical transcription services significantly, and in some cases entirely.
  • Prior authorisation, coding and billing are very large cost centres that are being progressively automated, threatening large administrative workforces in hospitals and insurance companies.
Nursing and Allied Health
  • Triage and patient monitoring: AI can monitor patient vital signs, identify and alert staff to deterioration, and prioritise nursing care. AI provides service augmentation, not replacement, of front-line nursing care, which must be physically provided.
  • Care coordination roles face pressure from AI that can track patient journeys, identify gaps and schedule follow-ups. At this time, however, this remains an augmentation tool rather than a job-replacement one.
  • Bedside care, emotional support and physical nursing are strongly human services that cannot be replaced by AI. They remain among the most protected areas across all industries.
Pharmaceuticals and medical science/research
  • Drug discovery timelines are being compressed by AI, which reduces some research roles but creates new roles in AI-guided drug design. An example of AI impact is AlphaFold's protein structure predictions, which transformed structural biology.
  • Clinical trial design and patient matching are being assisted by AI, not replaced by it.
In healthcare, it is the administrative organisational pyramid that is being compressed, with headcount reduction. Clinical roles continue, possibly with significantly increased patient volumes over time.

Friday, 24 April 2026

ANZAC Day 2026

ChatGPT image

ANZAC Day continues to have strong public support in recognition of the service of men and women in time of war. This special commemorative day has been held on 25 April for 110 years and was originally intended to honour the members of the Australian and New Zealand Army Corps who served in the Gallipoli campaign of 1915 (World War I). It has since expanded to include other conflicts and peacekeeping operations up to the present day. On this day, those who lost their lives as a result of their service are particularly remembered.

Lest we forget

Wednesday, 22 April 2026

Science: the mind and imagination

 

How does imagination really work in the brain? New theory upends what we knew

Grandfailure/Getty Images

Thomas Pace, University of the Sunshine Coast and Roger Koenig-Robert, University of Technology Sydney; UNSW Sydney

Your brain is currently expending about a fifth of your body’s energy, and almost none of that is being used for what you’re doing right now. Reading these words, feeling the weight of your body in a chair – all of this together barely changes the rate at which your brain consumes energy, perhaps by as little as 1%.

The other 99% is used on the activity the brain generates on its own: neurons (nerve cells) firing and signalling to each other regardless of whether you’re thinking hard, watching television, dreaming, or simply closing your eyes.

Even in the brain areas dedicated to vision, the visuals coming in through your eyes shape the activity of your neurons less than this internal ongoing action.

In a paper just published in Psychological Review, we argue that our imagination sculpts the images we see in our mind’s eye by carving into this background brain activity. In fact, imagination may have more to do with the brain activity it silences than with the activity it creates.

Imagining as seeing in reverse

Consider how “seeing” is understood to work. Light enters the eyes and sparks neural signals. These travel through a sequence of brain regions dedicated to vision, each building on the work of the last.

The earliest regions pick out simple features such as edges and lines. The next combine those into shapes. The ones after that recognise objects, and those at the top of the sequence assemble whole faces and scenes.

Neuroscientists call this “feedforward activity” – the gradual transformation of raw light into something you can name, whether it’s a dog, a friend, or both.

In brain science, the standard view is that visual imagination is this original seeing process run in reverse, from within your mind rather than from light entering your eyes.

So, when you hold the face of a friend in mind, you start with an abstract idea of them – a memory or a name, pulled from the filing cabinet of regions that sit beyond the visual system itself.

That idea travels back down through the visual sequence into the early visual areas, which serve as your brain’s workshop where a face would normally be reconstructed from its parts – the curve of a jawline, the specific shade of an eye. These downward signals are called “feedback activity”.

A signal through the static

However, prior research shows this feedback activity doesn’t drive visual neurons to fire in the same way as when you actually see something.

At least in the brain regions early in the vision process, feedback instead modulates brain activity. This means it increases or decreases the activity of the brain cells, reshaping what those neurons are already doing.

Even behind closed eyes, early visual brain areas keep producing shifting patterns of neural activity resembling those the brain uses to process real vision.

Imagination doesn’t need to build a face from scratch. The raw material is already there. In the internal rumblings of your visual areas, fragments of every face you know are drifting through at low volume. Your friend’s face, even now, is passing through in pieces, scattered and unrecognised. What imagining does is hold still the currents that would otherwise carry those pieces away.

All that’s needed is a small, targeted suppression of neurons that are pulled by brain activity in a different direction, and your friend’s face settles out of the noise, like a signal carving its way through static.

Steering the brain

In mice, artificially switching on as few as 14 neurons in a sensory brain region is enough for the animal to notice it and lick a sugar-water spout in response. This shows how small an intervention in the brain can be while still steering behaviour.

While we don’t know how many neurons are needed to steer internal activity into a conscious experience of imagination in humans, growing evidence shows the importance of dampening neural activity.

In our earlier experiments, when people imagined something, the fingerprint it left on their behaviour matched suppression of neuronal activity – not firing. Other researchers have since found the same pattern.

Other lines of evidence strengthen our theory, too. About one in 100 people have aphantasia, which means they can’t form mental images at all. One in 30 form these images so vividly they approach the intensity of images we actually see, known as hyperphantasia.

Research has found that people with weaker mental imagery have more excitable early visual areas, where neurons fire more readily on their own. This is consistent with a visual system whose spontaneous patterns are harder to hold in shape.

Taking all this together, the spontaneous activity reshaping hypothesis – our new theory that imagination carves images out of the steady stream of ongoing brain activity – explains why imagination usually feels weaker than sight. It also explains why we rarely lose track of which is which.

Visual perception arrives with a strength and regularity the brain’s own internal patterns don’t match. Imagination works with those patterns rather than against them, reshaping what is already there into something we can almost see.

Thomas Pace, Researcher and Lecturer at the Thompson Institute, University of the Sunshine Coast and Roger Koenig-Robert, Senior Research Fellow, Graduate School of Health, University of Technology Sydney; UNSW Sydney

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Earth Day 2026

 

Sunday, 19 April 2026

Artificial intelligence Part 5: specific industry impacts - finance and banking

ChatGPT image
AI is particularly suited to back-office operations in banks and financial institutions, analysing large amounts of financial data. Typically, banks and financial institutions employ large numbers of people to undertake functions such as compliance documentation, fraud review, transaction monitoring, credit analysis and financial reporting. AI systems designed by Palantir Technologies and SAS Institute, for example, can review financial data and identify anomalies much faster than manual teams. The impact of AI on specific industry segments is summarised as follows -

Investment banking and capital markets
  • Analyst roles are vulnerable to AI systems. The business folklore of junior bankers working 100-hour weeks on Excel models, pitch books and risk-management due diligence is already under severe pressure, as these tasks are highly structured and can be executed by AI. Known examples include Goldman Sachs and JPMorgan deploying AI for financial modelling, earnings analysis and report generation.
  • Equities market research has been transformed as AI can monitor thousands of stocks, synthesise earnings and generate initial research notes faster than any human team. 
Asset management
  • Quantitative analysis and factor modelling can be easily augmented by AI, and this is increasingly occurring. The role of quantitative analysts is changing and evolving as a result.
  • Portfolio reporting and client communication are increasingly being automated with AI at the commodity end.
  • Active investment fund management comes under further pressure as passive funds are increasingly guided by AI-driven strategies.
  • Compliance reporting, a very large cost centre in financial markets, is being substantially automated with AI. Automation was an existing trend for many years, but AI enables a faster rate of uptake.
Retail and commercial banking
  • Loan underwriting is already largely algorithmic and automated at the retail consumer and SME business level. AI does not alter the trend but merely further reduces the remaining human review layer.
  • Customer service and branch banking continue a long decline in face-to-face service. This situation, however, is subject to fluctuations due to community pressure and increasing consumer preference for personal interaction for specific services. AI's influence is limited in this line of business activity.
  • Fraud detection and anti-money laundering (AML) monitoring are already within the AI-dominated sphere. Human reviewers have been shifting to exception handling only.
  • Financial advice at the mass-market level already makes limited use of robo-advisers. This segment is, however, subject to regulation and government oversight, and to requirements for financial advice licences, accountability and legal liability. The use of robo-advisers beyond limited information provision and recommendations for the mass retail market has not yet occurred. High-net-worth individuals particularly prefer human advisers and personal banking managers over an automated service. Various financial advice scandals in the sector may also limit the use of AI for the time being.
As with all industries, the use of AI in finance and banking is most easily implemented in large data analysis, administrative and reporting tasks. It is not well suited to client relationships and regulatory, legal and compliance responsibilities.

Saturday, 18 April 2026

Artificial intelligence Part 4: specific industry impacts - graphic arts and visual design

ChatGPT image
Graphic arts and visual design form another industry heavily exposed to AI, particularly to impacts such as hierarchical pyramid compression. Tasks and projects that once needed a team of junior artists can now be completed by a single art director using AI tools. AI image systems can now produce concept art, advertising visuals, book covers, storyboards and marketing graphics. Specific industry segments affected are discussed as follows -

Commercial illustration and stock art
  • Stock photography and illustration is already heavily impacted. Companies such as Shutterstock, Getty Images and Adobe all now offer AI image generation. The market for generic commercial illustration has largely collapsed for independent artists.
  • Illustrators who designed book covers, editorial art and advertising assets (mainly mid-tier commercial work) now face severe income compression. This blog uses AI-generated images, having once held accounts with commercial image suppliers such as Shutterstock.
Advertising and brand design
  • Mood boards, concept art and campaign mockups are increasingly AI-generated at the brief stage.
  • The jobs of junior designers whose purpose is to execute pixel-perfect images under senior creative direction are now heavily at risk, as these tasks are automatable.
UI/UX design
  • AI tools such as Figma AI can automate layout generation, component creation and user-flow suggestions. Junior UI designers who develop wireframes face significant automation pressure.
  • UX research such as interviews, synthesis and insight generation remain more protected however even parts of these processes such as synthesis and pattern recognition can be managed through AI.
The multi-part series covering AI, published in this blog, has been researched and compiled using Claude (Anthropic), ChatGPT (OpenAI) and Grok (xAI).

Wednesday, 15 April 2026

Climate change - Antarctic Emperor Penguins and fur seals now endangered species

 

The beloved emperor penguin and Antarctic fur seal are now officially endangered. Here’s what can be done

The Conversation, CC BY-ND
Mary-Anne Lea, University of Tasmania; Jane Younger, University of Tasmania, and Noemie Friscourt, University of Tasmania

In 1902, British explorer Robert Falcon Scott spotted a large group of large black and white birds at Ross Island, Antarctica. This was among the many milestones of Scott’s famous Discovery expedition: the first breeding colony of emperor penguins.

Now, only 124 years since this penguin colony was discovered, emperor penguins have officially been listed as endangered, along with the Antarctic fur seal. As the world warms, Antarctic krill are shifting southwards and sea ice is shrinking at record levels. And these unprecedented changes are having a domino effect on these species.

These are the first penguin and pinniped – marine mammals that have front and rear flippers – to be given this conservation status in the Southern Ocean. Their perilous situation is a critical turning point, and shows how rapidly the Antarctic environment is changing.

At the same time, the spread of highly contagious avian influenza, or bird flu, adds a new and immediate threat to Southern Ocean wildlife, compounding the pressures of climate change on stressed species.

Antarctic fur seal with pups at Salisbury Plain on South Georgia, with snow-covered hills in the background.
Antarctic fur seal with pups at Salisbury Plain on South Georgia. The number of fur seals has dropped by over 50% since 1999. Posnov/Getty

Dramatic declines linked to climate change

The first emperor penguin breeding colony was discovered at Cape Crozier, on Ross Island, during Robert Falcon Scott’s Discovery expedition in 1902. A decade later, Scott’s Terra Nova expedition returned, in part to collect emperor penguin eggs. It was an ill-fated expedition, immortalised in Apsley Cherry-Garrard’s famous book, The Worst Journey in the World.

In the 1960s, Scott’s son, Sir Peter Scott, one of the founders of modern conservation, helped establish the International Union for the Conservation of Nature’s Red List. Just 124 years after those early discoveries at Cape Crozier, that same framework has now been used to classify emperor penguins as endangered. The swift arc from discovery to extinction risk is a striking reminder of how quickly the species’ fortunes have changed.

Over nine years, between 2009 and 2018, emperor penguin numbers fell by 10%. Their numbers are expected to halve by 2073.

A group of southern elephant seals at rest.
Southern elephant seals are now officially listed as vulnerable. Mary-Anne Lea, CC BY-ND

The decline is more pronounced for Antarctic fur seals. Hunted to the brink of extinction in the early 1880s, by 1999 their numbers had rebounded to an estimated 2.1 million mature seals. But since then, the global population has decreased by more than 50%, to about 944,000 mature individuals.

In just a decade, they have been reclassified on the IUCN’s Red List, going from “least concern” – those species that are widespread and at low risk of extinction – to “endangered”. The IUCN’s Red List is the comprehensive information source on the extinction risk status of species. This shows the remarkable speed at which these seals are declining.

Climate change and bird flu

Both of these dramatic declines are linked to climate change. Warming ocean temperatures and a reduction in sea ice affect the availability of the Antarctic fur seal’s key prey, Antarctic krill. Krill are shifting southwards and moving deeper, potentially making them less accessible to some predators. Competition with a growing population of whales has also increased.

Emperor penguins, by contrast, are completely dependent on sea ice. They use it as a stable platform for courtship, incubating their eggs and rearing chicks. But as sea ice declines and becomes less reliable, their breeding success is increasingly threatened. If the ice breaks up before chicks are fully developed, many are unable to survive.

At the same time, the spread of highly contagious bird flu adds a new and immediate threat to Southern Ocean wildlife. High mortality associated with avian influenza has also caused the uplisting of the southern elephant seal to “vulnerable” this week.

Some elephant seal populations have experienced more than 90% of pups dying, alongside sharp declines in breeding adults. These represent tens of thousands of animals lost, with many Antarctic fur seals also dying as a result of bird flu outbreaks.

emperor penguin chicks at Cape Crozier.
Emperor penguin chicks at Cape Crozier. Mary-Anne Lea, CC BY-ND

We need to know more

Emperor penguins, Antarctic fur seals and southern elephant seals are three of the more widely researched Southern Ocean predators. But there is still a lot we don’t know, because of the remote location and the difficulty of sustaining research over time. And there are many species we know far less about. Antarctic ice seals, including Weddell seals, crabeater seals, leopard seals, and Ross seals, have “unknown” population trends on the IUCN red list, meaning there is not enough data to know if numbers are declining.

These recent listings make clear the urgent and ongoing need for improved, real-time monitoring. We need to know much more about wildlife health and population trends, the Antarctic environment and sea ice quality.

Human-driven threats facing Antarctic wildlife are many, and cumulative. To respond, we need to better protect Antarctic habitat and the species that live there. We need to reduce the interaction of marine species with industrial fishing. And we must improve how we assess current and suspected threats in Antarctica, when there is growing evidence of impacts.

Defining these animals as endangered is a stark reminder of how quickly Antarctica is changing before our eyes. Without a rapid reduction in greenhouse gas emissions and sustained conservation action, these species may be lost forever.

Mary-Anne Lea, Professor in Marine/Polar Predator Ecology, University of Tasmania; Jane Younger, Senior Lecturer in Southern Ocean Vertebrate Ecology, Institute for Marine and Antarctic Studies, University of Tasmania, and Noemie Friscourt, Research Associate, Institute for Marine and Antarctic Studies, University of Tasmania

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Sunday, 12 April 2026

Artificial intelligence Part 3: specific industry impacts - film and television

ChatGPT image
The impact of AI is the most pronounced in the film and television industry with a variety of occupations impacted by the technology. The WGA and SAG-AFTRA union strikes in the United States in 2023 highlighted the concerns of people employed in the creative industries. Breaking down the various subsectors in the film and television industry, the role of AI can be easily defined -

CGI and VFX production
AI now covers environment generation, crowd simulation, rotoscoping, motion cleanup, texture creation and background characters.
  • Rotoscoping, cleanup and compositing have traditionally employed large pools of junior labour, and these tasks are being automated rapidly. Mid-tier VFX companies are under existential pressure, with work bifurcating: very high-end boutique work remains, while commodity work is fully AI-generated. The roles disappearing are junior asset builders, repetitive compositing roles and the large teams that produce background elements.
  • AI tools include Unreal Engine, Blender, Runway, Sora and similar programs.
Acting and performance
AI is already producing synthetic actors to create digital doubles and AI-generated crowds.
  • Background artists are already being displaced, in a limited manner, by AI-generated crowds and extras. This displacement of extras, crowd performers and minor background roles is expected to increase.
  • Voice acting is severely threatened, as synthetic voices are now near indistinguishable from real human voices and can be used for minor characters, video games, commercials and dubbing. Studios can licence a voice and use it indefinitely.
  • AI tools include offerings from Nvidia and Runway.
Writing
AI can already develop plot structure, dialogue drafts, storyboards, episode outlines and alternate scene ideas. Writers' rooms that once had 6-12 junior writers now require only a head writer, 2-3 senior writers and AI-assisted drafting tools; a showrunner with AI assistance may need only 2-3 senior writers rather than a full room.

Localisation and dubbing is already occurring using AI, replacing human translators and lip-sync dubbing artists at scale.

The reality is that with time and patience, AI will enable very small teams to produce cinema-quality films. Early versions of AI films can already be found on YouTube; however, many of these projects suffer from continuity failures and technical deficiencies in storytelling structure.

The safest roles in the AI era are those positions with creative authority, not basic production: for example, showrunner, art director, creative director, lead animator and production designer. These are decision-making roles that decide what should exist rather than merely producing it.

Saturday, 11 April 2026

Artificial intelligence Part 2: impact on the structure of employment and reduction of entry level roles

ChatGPT image
As artificial intelligence (AI) continues to be developed and implemented in various forms across workplaces, its impact on employment is becoming apparent even at this early stage of adoption. When discussing AI, it is important to define its capabilities.

AI carries out three activities across all industries -
  1. automates the repetitive layer
  2. compresses the workforce pyramid
  3. raises the value of senior decision-makers (to an extent)
A simple example demonstrates this impact on organisational structure.
An industry that once had this structure -
  • 1 Director
  • 3 Senior professionals
  • 15 junior staff
Under AI capability, this becomes a structure with -
  • 1 Director
  • 3 Senior professionals
  • 3-5 AI-assisted operators
AI across many white-collar industries removes what is called the "first draft economy". Many jobs existed primarily to produce first drafts of various outputs such as reports, media releases, policy notes, research documents, scripts, designs and software code. AI can now produce much of this instantaneously.

AI is starting to hollow out the traditional 'career ladder'. The junior roles that people once used to enter professions are disappearing first. This has long-term consequences for how expertise and experience are developed in society, creating a "pipeline problem" that is becoming one of the dominant structural challenges of implementing AI.

AI does compress some organisational hierarchies and enables an increase in the number of people or functions that a single leader can manage. This is known as the 'span of control', which AI increases while reducing certain management layers in organisations. Hierarchical compression is only one aspect of AI's impact; the very shape of organisations also changes, with -
  • fewer administrative workers
  • fewer reporting layers
  • smaller teams with higher productivity
  • leaders responsible for larger spans of activity
Roles that involve accountability, legal responsibility or political authority will remain human dominated. AI does reduce the documentation workforce that produces reports, compiles data, drafts documents and summarises information. It does not replace roles that have decision authority, physical presence, strategic judgement and/or legal accountability. 

As another example of structural change, before AI implementation, a very large organisation often had this structure -
  • Executive leadership
  • Senior managers
  • Middle managers
  • Supervisors/team leaders
  • Large operational workforce
After AI implementation, the organisation could be structured as -
  • Executive leadership
  • Senior specialists
  • Fewer managers
  • AI-enabled reduced operational staff
Effectively the middle and bottom tiers shrink.

The multi-part series covering AI, published in this blog, has been researched and compiled using Claude (Anthropic), ChatGPT (OpenAI) and Grok (xAI). Later posts on this topic will list specific industries where change is already happening.

Wednesday, 8 April 2026

Health and coffee

Does coffee raise your blood pressure? Here’s how much it’s OK to drink

Olga Pankova/Getty Images
Clare Collins, University of Newcastle

Coffee first entered human lives and veins over 600 years ago.

Now we consume an average of almost two kilos per person each year – sometimes with very specific preferences about blends and preparation methods. How much you drink is influenced by genes acting on your brain’s reward system and caffeine metabolism.

Coffee can raise your blood pressure in the short term, especially if you don’t usually drink it or if you already have high blood pressure.

But this doesn’t mean you need to cut out coffee if you have high blood pressure or are concerned about your heart health. Moderation is key.

So how does coffee affect your blood pressure? And if yours is high, how much is OK to drink?

What is high blood pressure?

Blood pressure is the force blood exerts on artery walls when your heart pumps. It’s measured by two numbers:

  • the first and biggest number is systolic blood pressure, which is the force generated when your heart contracts and pushes blood out around your body

  • the lower number, diastolic blood pressure, is the force when your heart relaxes and fills back up with blood.

Normal blood pressure is defined as systolic blood pressure of less than 120 millimetres of mercury (mm Hg) and diastolic blood pressure of less than 80 mm Hg.

Once your numbers consistently reach 140/90 or more, blood pressure is considered high. This is also called hypertension.
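The thresholds above can be expressed as a simple rule. The sketch below is only an illustration of those two cut-offs; the function name and the "in between" label are my own, not terminology from the article or any clinical guideline:

```python
def classify_bp(systolic: int, diastolic: int) -> str:
    """Classify a reading using the thresholds in the text:
    normal below 120/80, hypertension at 140/90 or more."""
    if systolic >= 140 or diastolic >= 90:
        return "hypertension"
    if systolic < 120 and diastolic < 80:
        return "normal"
    return "in between"  # above normal but below the hypertension cut-off

print(classify_bp(118, 75))  # normal
print(classify_bp(150, 95))  # hypertension
```

Note that either number crossing its threshold is enough for a hypertension classification, which matches the "140/90 or more" wording.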

Knowing your blood pressure numbers is important because hypertension doesn’t have any symptoms. When it goes untreated, or isn’t well-controlled, your risk of heart attacks and strokes increases, and existing kidney and heart disease worsens.

About 31% of adults have hypertension with half unaware they have it. Of those taking medication for hypertension, about 47% don’t have it well-controlled.

How does coffee affect blood pressure?

Caffeine in coffee is a muscle stimulant that increases the heart rate in some people. This can potentially contribute to an irregular heartbeat, known as arrhythmia.

Caffeine also stimulates adrenal glands to release adrenaline. This makes your heart beat faster and your blood vessels to constrict, which increases blood pressure.

Blood caffeine levels peak between 30 minutes and two hours after a cup of coffee. Caffeine’s half-life is 3–6 hours, meaning blood levels will reduce by about half during this time.

The range is due to age (kids have smaller, less mature livers so can’t metabolise it as fast), genetics (people can be fast or slow metabolisers) and whether you usually drink it (regular consumers clear it faster).
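As a rough illustration of what a half-life means in practice, remaining caffeine follows exponential decay. This is a minimal sketch assuming a 5-hour half-life (within the 3–6 hour range above) and an illustrative 100 mg dose; the function name and dose are my own assumptions, not figures from the article:

```python
def caffeine_remaining(dose_mg: float, hours: float, half_life: float = 5.0) -> float:
    """Exponential decay: the level halves once per half-life elapsed."""
    return dose_mg * 0.5 ** (hours / half_life)

print(round(caffeine_remaining(100, 5)))   # 50 mg after one half-life
print(round(caffeine_remaining(100, 10)))  # 25 mg after two half-lives
```

With a 3-hour half-life (a fast metaboliser) the same dose falls to about 25 mg after only 6 hours, which is why regular consumers and fast metabolisers feel the effects fade sooner.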

The impact of caffeine on blood pressure from coffee (and cola, energy drinks and chocolate) varies. Research reviews report increases in systolic blood pressure of 3–15 mm Hg and increases in diastolic blood pressure of 4–13 mm Hg after consumption.

The effect of caffeine also depends on a person’s usual blood pressure. An increase in blood pressure may be more risky if you have hypertension and existing heart or liver disease, so it’s best to discuss your coffee consumption with your doctor.

What else is in coffee?

Coffee contains hundreds of phytochemicals: compounds that contribute flavour, aroma, or influence health and disease.

Phytochemicals that directly affect blood pressure include melanoidins, which regulate the body’s fluid volume and activity of enzymes that help control blood pressure.

Quinic acid is another phytochemical shown to lower systolic and diastolic blood pressure by improving the lining of blood vessels, allowing them to better accommodate blood pressure rises.

Can coffee cause hypertension?

In a review of 13 studies that included 315,000 people, researchers examined associations between coffee intake and the risk of hypertension.

During study follow-up periods, 64,650 people developed hypertension, with the researchers concluding coffee drinking was not associated with an increased risk of developing the condition.

Even when they examined data by gender, amount of coffee, decaffeinated versus caffeinated, smoking or years of follow-up, coffee was still not associated with an increased risk of developing hypertension.

The only exceptions suggesting lower risk were for five studies from the United States and seven low-quality studies, meaning those results should be interpreted with caution.

A separate Japanese study followed more than 18,000 adults aged 40–79 years for 18.9 years. This included about 1,800 people who had very high blood pressure (grade 2-3 hypertension), with systolic blood pressure of 160 or above or diastolic blood pressure of 100 or above.

Here, risk of dying from cardiovascular disease, including heart attack or stroke, was double among those drinking two or more cups of coffee a day compared to non-drinkers.

There were no associations with death from cardiovascular disease for those who had either normal blood pressure or mild (grade 1) hypertension (systolic blood pressure 140–159 or diastolic blood pressure 90–99).

The bottom line

There is no need to give up coffee. Here’s what to do instead:

  1. know your blood pressure, health history and which food and drinks contain caffeine

  2. consider all factors that influence your blood pressure and health – family history, diet, salt and physical activity – so you can make informed decisions about what you consume and how much you move

  3. be aware of how caffeine affects you and avoid it before having your blood pressure measured

  4. avoid caffeine in the afternoon so it doesn’t affect your sleep

  5. aim to moderate your coffee intake by drinking four cups or less a day or switching to decaf

  6. if you have systolic blood pressure of 160 or above or diastolic blood pressure of 100 or above, consider limiting to one cup a day, and talk to your doctor.

Clare Collins, Laureate Professor in Nutrition and Dietetics, University of Newcastle

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Tuesday, 7 April 2026

Artificial intelligence: fast html code of comets example

The example above is HTML code generated by AI in one second to show comets crossing the sky.

Friday, 3 April 2026

Environment - Microplastics have been located at every level of the world's oceans

ChatGPT image
Microplastics have been found throughout the world's oceans and at all levels of the water column, following a comprehensive survey of over 1,885 sites across the planet. The survey, conducted by researchers from Japan, China, New Zealand, Italy, the Netherlands and the United States, located microplastics at all ocean depths, including the deepest parts. The Mariana Trench, for example, recorded more than 13,000 microplastic particles per cubic metre nearly 7 miles down.

Of particular concern from the findings is that the smallest particles were distributed almost evenly throughout the water column, rather than concentrating at the surface. Another key finding from the survey measurements is that the polymers in these plastics account for a significant share of the carbon in the water. At depths of 2,000 metres, polymers comprise as much as 5 per cent of the carbon.



These high carbon levels may reduce the capacity of oceans to absorb carbon dioxide from the atmosphere and thus accelerate global warming.

The full report can be accessed here: Microplastics in the ocean

Artificial intelligence: A glowing sun: coded by AI in 0.5s

Thursday, 2 April 2026

Easter 2026 - customs

AI generated image - ChatGPT
The Easter period for Christians celebrates the resurrection of Jesus from the dead, a chief tenet of their faith. It's a period of holidays, religious practices, rituals and the consumption of specific foods such as hot cross buns and chocolate eggs (or the also-popular chocolate rabbits).

For the religious faithful, the concept of resurrection is one where, through the faith in God, followers of Jesus are resurrected spiritually and walk a new existence through eternal salvation and dwell in the Kingdom of Heaven.

The custom of Easter eggs is a symbol of life and rebirth, connected with the empty tomb upon the resurrection of Jesus. Eggs were previously chicken eggs dyed in different colours, but in recent decades they have taken on a sweet form through the use of chocolate. In Orthodox traditions, dyed eggs remain the common practice.

The hot cross bun is a spiced bun made with fruit and marked with a cross on top. Traditionally eaten on Good Friday in the Christian calendar, the bun marks the end of the Lenten fast. Parts of the bun have different meanings, but the cross on top is unmistakably a symbol of the crucifixion of Jesus. The bun has a long history stretching back to the 6th Century, with variations occurring in the centuries thereafter.

May the period of Easter be one of reflection and celebration in a conflicted world.

Tuesday, 24 March 2026

Climate change - the world continues to heat up

 

The latest world climate report is grim, but it’s not the end of the story

Andrew King, The University of Melbourne

It’s no secret our planet is heating up.

And here’s the evidence: we’ve just experienced the 11 hottest years on record, with 2025 being the second or third warmest in global history.

The annual State of the Climate report, published today by the World Meteorological Organization, suggests we’re still too reliant on fossil fuels. And that’s pushing us further from our goal to decarbonise.

So what is happening to our climate? And how should we respond?

The climate picture

Unfortunately, the most recent climate data makes for grim reading.

Let’s look back at 2025, through the lens of four climate change indicators.

Carbon dioxide

We now have a record amount of carbon dioxide in the atmosphere, about 50% higher than pre-industrial levels. And we’re still emitting large amounts of carbon dioxide through our use of fossil fuels. In 2025, global emissions reached record high levels. The carbon dioxide we emit can stay in the atmosphere for a long time. So the longer we keep emitting large amounts of carbon dioxide, the more concentrated it becomes in our atmosphere.

Temperature

In 2025, the world experienced its second or third warmest year on record, depending on which dataset you use. The average temperature was about 1.43°C above the pre-industrial average.

This is particularly unusual given we observed slight La Niña conditions in the Pacific region. La Niña is a type of climate pattern characterised by temperature changes in the Pacific Ocean. It typically creates milder, wetter conditions in Australia and has a cooling effect on the global average temperature. But even with La Niña conditions, the planet stayed exceptionally hot.

And each of the last 11 years was hotter than any year before that period. This is true across all the different datasets used in the report. However, this does not mean a new record was set each year.

Oceans and ice

In 2025, the heat held within the world’s oceans reached a record high. And as our oceans continue to warm, sea levels will also rise. Hotter oceans also speed up the process of acidification, where oceans absorb an increased amount of carbon dioxide with potentially devastating consequences for some marine animals.

The amount of Arctic and Antarctic ice is also well below average. This report shows sea ice extent, a measure of how much ocean is covered by at least some sea ice, is at or close to record low levels in the Arctic. Meanwhile, the amount of ice stored in glaciers has also significantly decreased.

Extreme weather

Research shows many of the most devastating extreme weather events of 2025 were exacerbated by human-driven climate change. The heatwaves in Central Asia, wildfires in East Asia and Hurricane Melissa in the Caribbean are just three examples. Through attribution analysis, which is how scientists determine the causes of an extreme weather or climate event, this report highlights how our greenhouse gas emissions are making severe weather events more common and intense.

How does Australia stack up?

Compared to most other countries, Australia has a disproportionate impact on the global climate.

This is largely because our per capita carbon dioxide emissions are about three times the global average. That means on average, each of us emits more carbon dioxide than people in any European country or the US.

Emissions matter because they exacerbate the greenhouse effect. That is the process by which greenhouse gases, such as carbon dioxide and methane, trap heat near Earth’s surface. So by emitting more greenhouse gases, we contribute to global warming. And research suggests Earth is warming twice as fast today as in previous decades.

However, Australia is also experiencing first-hand the adverse effects of human-induced climate change.

In 2025, we lived through our fourth-warmest year on record. The annual surface temperatures of the seas around Australia reached historic highs, beating the record temperatures set in 2024. And last March was the hottest March we’ve seen across the continent.

Here in Australia, we are also battling longer and hotter heatwaves and bushfire seasons. And scientists warn these extreme weather events will only become more common.

The Bureau of Meteorology’s annual summary highlights how Australia’s climate is changing.

So what can we do?

The 2025 State of the Climate Report shows how much, and how quickly, we are changing our climate. And it is worryingly similar to previous reports, highlighting the need for urgent action.

The priority should be decreasing our emissions. This would slow down global warming, which will only continue if we keep the status quo. Some countries are already decarbonising rapidly, in part through transitioning to renewable electricity supplies. Others, including Australia, need to move much faster to reduce emissions.

Crucially, we must also meet our net zero targets. In Australia, as in many other countries, we are aiming to reach net zero by 2050. The sooner we reach net zero, the more likely we are to avoid harmful climate change impacts in future. To achieve net zero, we need to significantly reduce our emissions while also increasing how much carbon we remove from the atmosphere.

Even if we meet our net zero targets, climate change will not magically disappear. However, by turning away from fossil fuels and cutting our greenhouse gas emissions now, we may spare future generations from its worst effects. That’s the least we can do.

Andrew King, ARC Future Fellow and Associate Professor in Climate Science, ARC Centre of Excellence for 21st Century Weather, The University of Melbourne

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Sunday, 22 March 2026

Artificial intelligence - graphic design examples

 
AI generated image - ChatGPT
One of the industries most immediately impacted by artificial intelligence (AI) is graphic design. Some images can appear as artistic creations (as shown above). Others can be created with a realistic appearance that is increasingly hard to detect as artificial (as shown below). Images take only seconds to create and can be easily adjusted and edited.

AI generated image - ChatGPT

Artificial intelligence Part 1: restructuring the workforce - what does AI do ?

AI generated image - Chat GPT
Media reports, opinion editorials and speculation by public commentators about artificial intelligence (AI) have been fuelling considerable instability for the sharemarket-listed ICT sector in major economies, as well as concerns in the workforce regarding the actual impact of potential employment losses. The reality is that the impact of AI is not well understood or clearly defined, as it is an emerging technology whose full ramifications are yet to be measured. Most job losses and employment reductions have occurred in information technology companies, predominantly in software development and business support teams. This however does not represent the true extent of the transformation that is coming.

A key feature of the articles and reports to date has been the under-representation of the actual impact. The effect of AI has essentially been over-emphasised in the technology sector and underplayed in the rest of the economy. In short, AI will impact white collar occupations the most and be more far-reaching than has so far been reported.

Current state of play
Current AI tools are built on large language models and work best for rules-based, screen-based work with set parameters. Workplace transformation is already occurring in -
  • Knowledge work (particularly entry level)
- Administrative assistants.
- Data entry clerks
- Paralegals doing document reviews
- Junior accountants
- Basic market researchers
  • Content production (that is essentially formulaic)
- Copywriters for generic marketing
- SEO (Search Engine Optimisation) article writers
- Basic graphic production
- Translation of common languages

These roles are not eliminated but fewer staff are needed as productivity rises.
  • Software roles (with a focus on junior roles)
- Junior coders
- QA testers
- Routine debugging of software

Senior engineers remain in demand; however, the career ladder below them is compressed and positions are reduced.
  • Customer interaction roles (accelerating an existing trend)
- Call centre agents
- Tier-1 technology support
- Scheduling and booking staff

These roles can be reduced or removed through the use of chatbots and AI voice agents.

This blog will be publishing a series of posts on the use of AI and its developing and continuing effect on the workforce and the economy. 

Sunday, 15 March 2026

Artificial intelligence - the fourth industrial revolution

Sunrise over the earth from space - AI generated image
The advent of artificial intelligence (AI) heralds the fourth industrial revolution, building on three previous waves of technological change. The AI revolution, as so termed, brings a very real prospect of genuine employment reductions (not redeployment) and hence social dislocation. Occupations will be replaced and workforce reductions can and will occur, often at lightning speed.

So what were the previous industrial revolutions?

1st industrial revolution: the shift from agrarian economies to mechanisation, driven by water and steam power.

2nd industrial revolution: the era of electrification and new power sources. This in turn enabled new advances in mechanisation and the advent of the assembly line. Mass production became possible in both consumer goods and business-to-business methods such as machine tools.

3rd industrial revolution: the information economy and the internet. Computers, semiconductors and the use of automation and early stage robotics. The move from analog to digital also comes into this era.

And now the 4th industrial revolution, which has heralded artificial intelligence, machine learning, quantum computing, biotechnology advances and connectivity between physical, digital and biological systems.

Why the 4th industrial revolution is so significant lies in the very characterisation of AI itself. The systems learn and improve on their own, make decisions and automate cognitive work. This is a paradigm shift, and one where the end point and absolute objectives are not at all clear.

Tuesday, 10 March 2026

Future age weapons now a reality

 

Israel’s ‘Iron Beam’: why laser weapons are no longer science fiction

Rafael Advanced Defense Systems
James Dwyer, University of Tasmania

As conflict escalates following the US and Israeli attacks on Iran, and Iran’s subsequent retaliatory strikes, reports have emerged that Israel may have used laser weapons to shoot down rockets fired by Hezbollah from Lebanon.

While the reports are unconfirmed, video circulating on social media appears to show rockets being destroyed within moments of launching without visible intervention – consistent with the effect of a “directed energy weapon” such as a laser.

It wouldn’t be the first time Israel has used its cutting-edge Iron Beam laser air defence system, but the incident offers a glimpse into a changing landscape where high-tech militaries are scrambling to keep up with barrages of small rockets and cheap, increasingly capable drones.

What is Iron Beam?

Most defensive systems use rocket-propelled missiles against incoming threats. Iron Beam, however, uses a laser – also known as a directed energy weapon.

Where a missile destroys a drone, shell or rocket by crashing into it or exploding near it, Iron Beam destroys targets by burning them with an extremely powerful laser.

Manufactured by Rafael Advanced Defense Systems, which “serves as Israel’s High-Energy Laser National Center for Excellence and National Lethality Lab”, a smaller version of Iron Beam was first successfully tested in 2022. The system was first used in practice last year, to shoot down drones launched by Hezbollah.

Using a 100 kilowatt solid state laser mounted on a mobile trailer, Iron Beam can be strategically deployed and moved depending on the current threat vector, and adds an additional layer of defence to Israel’s existing, layered defensive systems.

How is it different to the Iron Dome, David’s Sling and Arrow air defences?

The biggest advantage of laser weapons over missiles is cost. A single Iron Dome interceptor missile costs about US$50,000 – which means the costs add up quickly when defending against large or frequent attacks.

Firing the Iron Beam laser costs a lot less. In 2022, Israel’s then prime minister Naftali Bennett said each shot cost around US$3.50, and more recent estimates suggest the cost may now have fallen as low as US$2.50 per shot.
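The cost gap the article describes can be made concrete with some back-of-envelope Python, using the figures quoted above (roughly US$50,000 per interceptor missile versus US$3.50 per laser shot). The 100-shot barrage is a purely hypothetical scenario for illustration.

```python
# Back-of-envelope cost comparison using the article's figures.
MISSILE_COST = 50_000   # US$ per Iron Dome interceptor (article figure)
LASER_COST = 3.50       # US$ per Iron Beam shot (2022 estimate)

shots = 100  # hypothetical barrage of 100 incoming threats
print(f"missiles: US${shots * MISSILE_COST:,.0f}")  # US$5,000,000
print(f"laser:    US${shots * LASER_COST:,.2f}")    # US$350.00
```

Defeating the same hypothetical barrage costs five million dollars with interceptors but only a few hundred with the laser, which is the economic motivation the article points to.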

An infrared image of high-energy laser test targeting a drone. Office of Naval Research / Lockheed Martin

The economics alone present a powerful motivator for militaries to develop and deploy these weapons.

Another significant advantage of Iron Beam and other directed energy weapons is that they don’t run out of ammunition. Whereas a missile battery needs to be reloaded after use, an energy weapon just needs power.

The only limiting factor for the number of shots is overheating due to the huge amounts of energy expended. Eventually a laser weapon needs to stop firing to cool down, or it will be damaged by the heat.

There’s little public information on how many shots these weapons can fire or at what rate before overheating, but it is widely assumed they can still easily outfire most conventional munitions.

Of course, Iron Beam doesn’t operate in isolation: Israel still possesses its other defensive capabilities. The cheaper Iron Beam can be used first, then backed up with other systems if needed.

The other limitation for directed energy weapons is range. They can’t reach as far as missiles such as David’s Sling or Arrow, so they are only useful for countering drones, artillery and short-range missiles.

Directed energy weapons on the ground can’t reach high-flying long-range ballistic missiles. What’s more, they are less effective in rainy, damp or cloudy conditions.

What role is Iron Beam playing in the current conflict?

Iron Beam (and other directed energy weapons being developed and deployed by other countries) are not intended to replace existing defensive systems, but to supplement them. The radically lower cost per shot provides far greater flexibility to counter “low cost” threats such as one-way drones or artillery shells.

In last year’s conflict with Iran, the United States, United Kingdom and Israel rapidly discovered they were expending large numbers of extremely expensive missiles to counter relatively cheap Iranian missiles, rockets and drones.

The US has responded with a crash course program to quickly arm its fighter jets with larger numbers of cheaper anti-drone rockets.

Directed energy weapons offer many of the same (if not greater) benefits for ground and naval-based defences.

Both the US and Israel reportedly expended a large proportion of their defensive missiles during the last conflict with Iran in 2025. Using directed energy weapons can also help preserve stores of these munitions.

Missile stockpiles are not easily replenished quickly. Even then, a large or sustained attack would quickly deplete them again.

An option that provides defence against shorter-range or slower threats allows the more expensive missiles to be held in reserve.

Where to from here?

War lasers may still sound like science fiction. But Israel is far from alone in developing and deploying them.

The US has tested laser drone and missile defences on navy ships. Both China and Japan have also tested naval and ground-based directed energy weapons.

For naval vessels in particular the benefits of directed energy weapons are immense. Reloading defensive missiles at sea is difficult, or often impossible, requiring a return to port.

In a high-intensity conflict (or a lower-intensity but prolonged conflict) this can present a significant challenge. It can also leave vessels vulnerable when they have depleted their missile stores, or are in port to rearm.

Running out of munitions is often a significant concern for defensive systems. Directed energy weapons lessen this worry – so we are likely to see them more and more as technology develops.

James Dwyer, Lecturer, School of Social Sciences, University of Tasmania

This article is republished from The Conversation under a Creative Commons license. Read the original article.