AI and the 2025 Hiring Freeze

2025 Hiring Freeze

In 2025, several prominent tech companies, including Salesforce, Microsoft, and Meta, have implemented hiring freezes and workforce reductions, primarily influenced by advancements in artificial intelligence (AI). These measures reflect a broader industry trend toward automation and efficiency.

Salesforce’s Strategic Shift

Salesforce CEO Marc Benioff announced that the company might not hire any new software engineers in 2025. This decision stems from significant productivity gains achieved through AI agents working alongside engineers. Benioff stated, “We have seen such incredible productivity gains because of the agents that work side by side with our engineers.”

Furthermore, Salesforce is reportedly planning to cut around 1,000 roles while hiring sales staff to bolster its AI initiatives. This move aligns with the company’s strategy to integrate AI into its operations, aiming to enhance efficiency and reduce costs.

Salesforce’s actions are part of a larger pattern within the tech industry. Companies like Google and Meta are also adopting AI to streamline operations. Google CEO Sundar Pichai revealed that AI now writes over 25% of new code at Google, significantly reducing human coding work. Meta CEO Mark Zuckerberg has stated that AI could soon replace midlevel software engineers, roles that typically command six-figure salaries.

This trend is raising concerns about a “white-collar recession” in 2025, as AI-driven automation may accelerate job displacement in the tech sector. Industry analysts warn that AI’s growing role could contribute to significant job losses among software engineers and related professionals.

Broader Impact Across Industries

The influence of AI extends beyond tech companies. In 2025, multiple major companies across various industries have announced layoffs. As many as 41% of global companies are planning workforce reductions over the next five years due to AI advancements. Notable companies implementing job cuts include Adidas, Ally Bank, BlackRock, and Wayfair.


41% of surveyed employers foresee staff reductions due to skills obsolescence

Future of Jobs Report, World Economic Forum

The integration of AI into business operations is reshaping the employment landscape, particularly in the tech industry. Companies like Salesforce, Microsoft, and Meta are leveraging AI to enhance productivity, leading to hiring freezes and workforce reductions. While AI offers significant efficiency gains, it also poses challenges for employment: businesses must reevaluate their workforce strategies, and workers will need new skill sets to adapt to the evolving job market.

Realization and Justification

As AI continues to automate coding and software development tasks, companies and managers will use several narratives to justify hiring freezes and the reduction of software developers.

Continue reading

Satellite Missions for Atmospheric Composition and Air Quality Monitoring


Earth’s system is a coupled system in which the different components (the atmosphere, the hydrosphere, the lithosphere, and the cryosphere) constantly interact at different spatial and temporal scales.

Atmospheric composition, and thus air quality (which affects human health as well as other lifeforms and agriculture), is a hot topic and an active area of research. When it comes to air quality, measuring pollutant concentrations such as particulate matter and trace gases is nothing new, and such measurements continue to develop and expand as technology improves and measurement networks grow.

However, as in atmospheric science and hydrology more broadly, point-based measurements can only do so much. For example, determining the water level or flow speed at a point along a river requires an understanding of the surface elevation (known as bathymetry when underwater) and of whether the point of interest lies upstream or downstream. Similarly, how the concentration of a pollutant evolves is governed by the underlying driving forces, most notably the wind field, the pollutant’s lifetime, and the nature of the chemical interactions that arise.
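
Schematically, and purely as an illustration (this formulation is mine, not from the original post), the local evolution of a pollutant concentration C can be summarised by a continuity-type balance:

\[
\frac{\partial C}{\partial t} + \mathbf{u}\cdot\nabla C = -\frac{C}{\tau} + S + R_{\mathrm{chem}},
\]

where \(\mathbf{u}\) is the wind field (transport), \(\tau\) the pollutant lifetime (removal), \(S\) the emission sources, and \(R_{\mathrm{chem}}\) the net chemical production or loss.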

It is therefore critical that the evolution of different chemical species (pollutants, when they affect human life) be tracked and studied in detail. Some of the questions science is trying to answer when it comes to air quality include:

  • What are the spatial and temporal variations of the concentrations of pollutants?
  • How are local and regional air quality affected by long-range transport?
  • How do air quality and climate change drive each other?
  • How is air quality affected by meteorology, and how are pollutants dispersed by weather?
  • How can fluxes between different regions be quantified or estimated?

For these and other questions to be answered, monitoring is required at the spatial scale and temporal frequency appropriate to the underlying phenomena. This is where satellite-based monitoring comes into play. A number of organisations have teamed up towards the common goal of making this monitoring a reality: a “virtual” constellation of satellites, composed of three missions, that will monitor air quality from space at unprecedented quality.

Continue reading

Quantum Harmonic Oscillator: Power series method in Maple


In the previous blog post, What is Computational Physics (Science)?, I ended with the following figure

Graph of the probability distribution of the 100th state of the quantum
harmonic oscillator (generated using the power series method).

and stated that I might write a post on how to solve the quantum harmonic oscillator numerically using the power series method (the other method being the ladder operator method [1]) and generate that figure. This post is just about that.
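
For reference, here is the standard textbook setup that the power series method rests on (this summary is mine, not taken from the rest of the post). The time-independent Schrödinger equation for the harmonic oscillator,

\[
-\frac{\hbar^2}{2m}\frac{d^2\psi}{dx^2} + \frac{1}{2}m\omega^2 x^2\,\psi = E\,\psi,
\]

becomes, in the dimensionless variable \(\xi = \sqrt{m\omega/\hbar}\,x\) with \(K = 2E/(\hbar\omega)\),

\[
\frac{d^2\psi}{d\xi^2} = (\xi^2 - K)\,\psi.
\]

Writing \(\psi(\xi) = h(\xi)\,e^{-\xi^2/2}\) and expanding \(h(\xi) = \sum_j a_j\,\xi^j\) gives the recursion

\[
a_{j+2} = \frac{2j + 1 - K}{(j+1)(j+2)}\,a_j,
\]

which must terminate for normalisable solutions, forcing \(K = 2n + 1\), i.e. \(E_n = \left(n + \tfrac{1}{2}\right)\hbar\omega\).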

Ok. First, I need to clear Maple’s memory with the restart command and import the PDEtools package (which provides the dchange command we will use on the Schrödinger equation later) and the Maplets[Elements] package (necessary if you want to generate a maplet with a slider).

restart;                  # clear Maple's internal memory and any previous definitions
with(PDEtools):           # we need the dchange command later in the solution
with(Maplets[Elements]):  # needed only if you want to generate a maplet with a slider
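
The excerpt stops at the package imports, so here is only a rough sketch of how the nondimensionalisation and series solution might proceed (the names SE, tr, u, and K below are my own choices, and the truncation order is arbitrary):

# Time-independent Schrodinger equation for the harmonic oscillator
SE := -hbar^2/(2*m)*diff(psi(x), x, x) + (1/2)*m*omega^2*x^2*psi(x) = E*psi(x);

# Change to the dimensionless variable xi = sqrt(m*omega/hbar)*x;
# this is what PDEtools (dchange) was loaded for
tr := {x = xi*sqrt(hbar/(m*omega)), psi(x) = u(xi)}:
SE_xi := simplify(dchange(tr, SE));

# With K = 2*E/(hbar*omega) the equation reduces to u'' = (xi^2 - K)*u.
# Solve that reduced form as a truncated power series about xi = 0
# (even-parity initial conditions shown as an example):
Order := 12:
dsolve({diff(u(xi), xi, xi) = (xi^2 - K)*u(xi), u(0) = 1, D(u)(0) = 0},
       u(xi), 'series');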

What is Computational Physics (Science)?


As a senior physics undergraduate, I have come to believe that scientific computation must be part of the physics curriculum. It is true that physics students are required to study and master many topics, languages, techniques, and skills, like mathematics, linguistics, and science communication; still, I think that computational physics should be a major part of the curriculum. It is not logical that, in the age of supercomputers, the physics curriculum should remain bound to pen and paper as it was before the advent of computers! I am not suggesting that physics should all be done on computers; absolutely not. The student must acquire the necessary theoretical and mathematical concepts and skills, besides the physics thinking, before delving into computational physics! What use would a computer be if its user doesn’t know what he wants to use it for? In other words, how would a physics student who hasn’t studied classical mechanics be able to solve a classical mechanics problem on a computer? He will surely not be able to do so, since he will not be able to appropriately instruct the computer due to his lack of conceptual physics and pen-and-paper problem-solving skills. In short, “a computer is as dumb as its user is dumb, and as smart as its user is smart; the smarter and more knowledgeable the user, the more productive and efficient the computer”!

The computer is a little over 70 years old. Many articles and resources claim that the first computer was the “Electronic Numerical Integrator And Computer”, or ENIAC for short, which is not technically correct. Many other computers preceded ENIAC, most of which were developed for military purposes (e.g., artillery calculations, cryptanalysis, etc.) and were analogue (or electro-mechanical) and programmed by punched cards. ENIAC was a room-sized computer, operated by several people turning switches on and off, that made use of vacuum tubes, the ancestor of the modern transistor.

One particularly interesting electromechanical machine (which could be called a computer) was the “bombe” [1], designed by the mathematician Alan Turing to crack the Enigma, the code used by the Nazis to encrypt messages.

Working rebuilt bombe at Bletchley Park [2].
Interior of the rebuilt bombe at Bletchley Park.

The bombe was in part successful in breaking the Enigma. Moreover, Alan Turing has impacted the modern-day internet as well; every one of us using the internet has definitely faced a “CAPTCHA”, which is used to counter bots and make sure the user is an actual human being and not a bot (short for robot). CAPTCHA is an abbreviation for “Completely Automated Public Turing test to tell Computers and Humans Apart”. And yes, the Turing in CAPTCHA is the same Turing as the mathematician of the 1940s, though the original Turing test pitted a human against a machine, not the other way round!

Continue reading