footrpint
typo
The GLEIF also produces an official mapping file, linking LEI records to the corresponding OpenCorporates records for the same legal entities (and this provides a route from users of OpenCorporates records to BIC and ISIN identifiers, which are also mapped to the LEI). This relationship is underpinned by both organisations having a data model where one-legal-entity is represented by one and only one record; where both organisations have public benefit missions; and where both are committed to non-proprietary identifiers.
Oh neat - so this presumably means if you have an LEI, then you can also find the corresponding OpenCorporates ID, which would let you find the corporate grouping if it exists.
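A quick sketch of what that lookup might look like, assuming the GLEIF mapping file is a two-column CSV. The column names, LEIs, and OpenCorporates IDs here are all made-up placeholders, not the real file format:

```python
import csv
import io

# Hypothetical miniature of the GLEIF LEI-to-OpenCorporates mapping file
mapping_csv = """lei,opencorporates_id
529900T8BM49AURSDO55,de/12345
213800D1EI4B9WTWWD28,gb/00445790
"""

def oc_id_for_lei(csv_text, lei):
    """Return the OpenCorporates ID mapped to an LEI, or None if unmapped."""
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["lei"] == lei:
            return row["opencorporates_id"]
    return None

print(oc_id_for_lei(mapping_csv, "213800D1EI4B9WTWWD28"))  # -> gb/00445790
```

The one-entity-one-record model on both sides is what makes a flat lookup like this viable at all.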
Instituting nondiscrimination or common carrier obligations on compute providers operating key points of the stack
This is a bit like Net Neutrality, but for compute
The FTC has already outlined this principle in its recent Amazon Alexa case
Reference this, it’s an interesting precedent
The UK’s Competition and Markets Authority recently published its initial report on AI Foundation Models, which will be followed up with a subsequent workstream in early 2024
Is it out yet?
Across the Atlantic, compute power has been an important element in France’s national interest in building out its AI capabilities. Among other moves, France funded the creation of the Jean Zay supercomputer in 2019, operated by the Centre national de la recherche scientifique (CNRS). The computer was used to train BigScience’s BLOOM large-scale AI model
Ah, so THAT's where BLOOM was created. Would this have played a role in the transparency too?
Over 50 new semiconductor projects were announced worth $200 billion following the passage of the Act, according to the Semiconductor Industry Association.162 Among them is TSMC, which plans to make a $40 billion investment in a new facility in Phoenix, Arizona.163 This is particularly notable because it illustrates that market subsidies can function to exacerbate rather than ameliorate market concentration if not carefully calibrated: given the existing bottlenecks in chip fabrication, such investments can easily be captured by dominant players even if they introduce more geographical spread.164 Notably, the chips produced within TSMC’s new facility will still be sent back to Taiwan for packaging and assembly, subverting the hope of creating fully made-in-the-USA chips.165
Do the European deals have similar issues with maintaining the same bottlenecks?
A 25 percent investment tax credit for semiconductor manufacturing and related equipment
There’s an ITC for manufacture now
There are a few other pathways toward overcoming the compute barrier, significant among which is a degree of decentralized training. Federated learning could possibly be a way to achieve scale without centralization
Ask Philipp about Flower and how they get around the interconnect issue
For instance, Hugging Face has partnered with AWS in a revenue-sharing agreement to allow developers using Hugging Face to use AWS’s compute and software
Ahhh, I hadn’t realised the kickback here. It’s now much clearer to me how they make cash.
Of particular note is that although TSMC is building a chip fabrication facility in Arizona under the US CHIPS Act, all of the chips fabricated at this facility will need to be sent to Taiwan for packaging, meaning that TSMC’s US chip production process will still require a global supply chain network
Ah… this is what packaging was referring to before
But this software dominance is also slowly being challenged. OpenAI developed Triton, an open-source software solution that it claims is more efficient than CUDA. Triton can only be used on Nvidia’s GPUs as of now.97 Meta developed PyTorch and then spun off the project as an open-source initiative housed under the Linux Foundation (still financially supported by Meta), and its new version performs relatively well on Nvidia’s A100.98 The benefit of PyTorch is that it can be used across a range of hardware, but on the flip side, it is not optimized for any particular chip
Ah… so THAT's what purpose PyTorch serves. PyTorch is to CUDA what OCP is to proprietary hyperscale server design
Cerebras differentiates itself by creating a large wafer with logic, memory, and interconnect all on-chip. This leads to a bandwidth that is 10,000 times more than the A100. However, this system costs $2–3 million as compared to $10,000 for the A100, and is only available in a set of 15. Having said that, it is likely that Cerebras is cost efficient for makers of large-scale AI models
Does this help get around the need for interconnect enough to avoid needing such large hyper scale buildings?
Moreover, its proprietary CUDA compiling software is the most well known to AI developers, which further encourages the use of Nvidia hardware as other chips require either more extensive programming or more specialized knowledge.
It’s good that this is so explicitly called out as a bottleneck
While FLOP/s grew by more than 6 times between Nvidia’s A100 chips and its latest H100 chips, memory bandwidth (interconnect) only grew by 1.65 times.82 Apart from practical and technological constraints, there is an energy cost to memory bandwidth, with a significant portion of a chip’s energy usage being attributed to interconnect. Overall, this means that interconnect is an important current constraint to the growth of computational power
This is nicely demonstrated by that recent off package /on package power draw differential shown at hot chips 2024
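The imbalance in the quote above can be put as a single number, using only the figures it gives:

```python
# Ratio of compute growth to interconnect growth between Nvidia's A100
# and H100, using the figures quoted above: FLOP/s grew "more than 6
# times" while memory bandwidth grew "only 1.65 times".
flops_growth = 6.0
bandwidth_growth = 1.65
imbalance = flops_growth / bandwidth_growth
print(round(imbalance, 2))  # -> 3.64
```

So each generation, the chips get roughly 3.6x hungrier for data than the interconnect can keep up with, which is the constraint the article is pointing at.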
An important technological constraint with current memory technology is that while logic has only one goal to optimize for (maximizing the number of transistors on a chip), memory is trying to optimize for multiple goals (capacity, bandwidth, and latency).75 Latency has usually lagged behind the other two
This is a really nice insight, presented succinctly
This translates directly into cost increases: SOTA AI chips are 10–1,000 times more cost-effective than SOTA CPUs, and 33 times more cost-effective than trailing node AI chips. Thus a large AI model built on trailing node AI chips would be at least 33 times more expensive than models using leading node AI chips.61
This is a really good example of demonstrating why people invest in new hardware
in the 1990s and 2000s, the US government reduced its level of investment as interventionist policy fell out of vogue. This set the stage for Intel’s decline relative to firms like Taiwan Semiconductor Manufacturing Company, now the world’s dominant chip fabricator; and ASML, a Dutch company that is the sole manufacturer of the equipment needed to build state-of-the-art chips.
I hadn’t realised the significance of the loss of subsidy in offshoring
This trend has borne out historically: before the deep learning era, the amount of compute used by AI models doubled in about 21.3 months; since deep learning as a paradigm took hold around 2010, the amount of compute used by models started doubling in only 5.7 months12. Since 2015 however, trends in compute growth have split into two: the amount of compute used in large-scale models has been doubling in roughly 9.9 months, while the amount of compute used in regular-scale models has been doubling in only about 5.7 months
If something is doubling faster in small models, how long before they overtake the larger models? I can’t do the maths in my head
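A quick back-of-envelope to answer my own question, using the 9.9-month and 5.7-month doubling times quoted above. The answer depends on the current compute gap between large-scale and regular-scale models, which the article doesn't give, so the 100x ratio below is a made-up assumption:

```python
import math

def catch_up_months(ratio, fast_dbl=5.7, slow_dbl=9.9):
    """Months until the faster-doubling trend closes a `ratio`x gap.
    Solves 2**(t/fast_dbl) == ratio * 2**(t/slow_dbl) for t."""
    return math.log2(ratio) / (1 / fast_dbl - 1 / slow_dbl)

# Hypothetical: if large-scale models currently use 100x the compute
# of regular-scale ones, the trends cross in about 7.5 years:
print(round(catch_up_months(100), 1))  # -> 89.3
```

Of course "regular-scale models catch up" really means the definition of large-scale shifts, but the crossover arithmetic is the same.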
Third, relatedly, we need to excite people with big ideas that are congruent with the crisis, and that simultaneously speak to people’s deep economic and employment anxieties and the cost of living crisis. We need billions of dollars more spent on transformative climate infrastructure that will employ tens of thousands of people. Rather than trying to incentivize heat pumps with inadequate rebates, let’s just make them free! (As PEI does for households with incomes under $100,000.) Let’s talk about free public transit, and huge subsidies for e-bikes, to liberate people from punishing transportation expenses. And let’s propose paying for a chunk of all that with wealth and windfall profits taxes (a recent Abacus survey found increasing taxes on the richest 1% to be a massive vote-winner), and suing the corporations that got us into this mess (as California is doing). These represent transformative policies that tackle multiple crises at once and bolster solidarity.
Wow. Bold. I love the sound of this, and yet I’m reflexively gearing myself up for the “how do we pay for it” response
Who will pick up the $280 billion bill? So far, it is the U.S. public. As I argued last week, we already effectively own these liabilities. So, how do we get the money to pay for them?
Wow, this is the cost of cleanup? How does this change the cost of a well if you have to pay a reasonable cost for the cleanup at the end of the life of the well?
“It also involves cleaning up more than 100 years of industrial activity and if you look at all these producing fields in the Weald and in the east midlands, it’s a bit like the North Sea.
wait. The UK used to produce oil?
Utilization effectiveness
This is a new term to me. I think it's a bit like PUE in that you want the number to be as close to 1 as possible (i.e. for 1 server's worth of utilisation, you want to have provisioned 1 server, not 5)
So, I think this is the term you might use to talk about how a hyperscaler datacentre might be full of individually efficient servers, but only have 5% of them serving workloads - the rest is just spare capacity ready to sell, or give away as credits
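A tiny sketch of how I'm reading the term, assuming it's defined by analogy with PUE (the name and formula here are my assumption, not a definition from the source):

```python
def utilization_effectiveness(provisioned_servers, busy_servers):
    """Assumed definition, by analogy with PUE: capacity provisioned
    divided by capacity actually serving workloads. Ideal value is 1.0."""
    return provisioned_servers / busy_servers

# The 5%-busy hyperscale example from the note above:
print(utilization_effectiveness(100, 5))  # -> 20.0
```

So a facility full of individually efficient servers could still score a dismal 20 by this measure.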
Aurora Serverless packs a number of database instances onto a single physical machine (each isolated inside its own virtual machine using AWS’s Nitro Hypervisor). As these databases shrink and grow, resources like CPU and memory are reclaimed from shrinking workloads, pooled in the hypervisor, and given to growing workloads
Oh, wow, so the workloads themselves are dynamically scaling up and down "vertically" as opposed to "horizontally" - I think this is a bit like dynamically changing the size of Docker containers that are running the databases while they're running
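A toy sketch of the pooling behaviour described above (this is my simplified model of the idea, not AWS's actual hypervisor mechanism): shrinking workloads return resources to a shared pool, and growing workloads draw from it.

```python
class ResourcePool:
    """Toy model of hypervisor-style resource pooling across workloads."""

    def __init__(self, total_cpus):
        self.free = total_cpus
        self.alloc = {}

    def resize(self, workload, cpus):
        """Grow or shrink a workload's allocation against the shared pool."""
        delta = cpus - self.alloc.get(workload, 0)
        if delta > self.free:
            raise RuntimeError("pool exhausted")
        self.free -= delta
        self.alloc[workload] = cpus

pool = ResourcePool(total_cpus=8)
pool.resize("db-a", 4)   # db-a grows
pool.resize("db-b", 3)
pool.resize("db-a", 1)   # db-a shrinks, returning 3 CPUs to the pool
pool.resize("db-b", 6)   # db-b grows using the reclaimed capacity
print(pool.free)         # -> 1
```

The "vertical" part is that each database's slice changes size in place, with no new instances spun up.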
We weren’t satisfied that only a relatively small number of volumes and customers had better performance. We wanted to bring the benefits of SSDs to everyone. This is an area where scale makes things difficult. We had a large fleet of thousands of storage servers running millions of non-provisioned IOPS customer volumes. Some of those same volumes still exist today. It would be an expensive proposition to throw away all of that hardware and replace it
They upgraded all of the servers rather than replacing them. That must’ve been a lot of work
We didn’t have to worry much about the network getting in the way since end-to-end EBS latency was dominated by HDDs and measured in the 10s of milliseconds. Even our early data center networks were beefy enough to handle our user’s latency and throughput expectations. The addition of 10s of microseconds on the network was a small fraction of overall latency.
Disks were so slow that the network didn’t really matter, basically
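The orders of magnitude in the quote make this concrete. Taking "10s of milliseconds" as ~10 ms and "10s of microseconds" as ~20 µs (both assumed round numbers, not figures from the source):

```python
# Rough share of end-to-end EBS latency contributed by the network
# in the HDD era, using assumed order-of-magnitude figures.
hdd_latency_us = 10_000   # ~10 ms of disk latency
network_us = 20           # ~20 us added by the network
share_pct = network_us / (hdd_latency_us + network_us) * 100
print(round(share_pct, 2))  # -> 0.2 (percent)
```

Once SSDs pull the storage latency down by a couple of orders of magnitude, that same network overhead stops being a rounding error, which is presumably where the story goes next.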
These platters have tracks that contain the data. Relative to the size of a track (<100 nanometers), there’s a large arm that swings back and forth to find the right track to read or write your data
Oh, so you're reading from a track like a record. It's much more like a record player than I thought
Clean Energy Marshall Plan has the makings of a compelling pitch to U.S. domestic audiences: investing in the clean energy transition abroad will benefit businesses and workers at home. Evidence of that effect is already easy to find. The clean investment boom is turning novel technologies into market mainstays: emerging technologies such as hydrogen power and carbon capture now each receive more investment than wind
This feels like an unfortunate example. Wind is one of the foundational technologies we would need for hydrogen to work, and CCS is still a boondoggle
The Green Climate Fund, the sole multilateral public financial institution devoted to addressing climate change, could follow this approach, too. Almost 15 years after it was founded, the GCF has disbursed only 20 percent of the funding it has received
Mind blown. Why?
To complement the Clean Energy Finance Authority, the tariff could be lowered in exchange for foreign procurement of clean energy technologies or of clean products made in the United States
So the CBAM as the stick, and the CEFA as the carrot
A carbon-based tariff, or a carbon border adjustment, should further motivate climate action by exempting countries that are hitting their nationally determined goals under the 2016 Paris climate agreement or those that fall below certain income and emission thresholds
Ah, they’ve explicitly said it here then.
To accomplish this, the United States must use expanded, stronger, and smarter trade authorities. For example, Washington should build into its tariffs on imported goods an assessment of how much carbon was used to produce them. Tariffs should be determined by the emission intensity of the trading partner’s entire industry, rather than company by company, to avoid “resource reshuffling,” whereby countries try to dodge penalties by limiting their exports to only products manufactured with clean energy instead of reducing their emissions overall. These tariffs should be aimed at all countries, but given its current production practices, China would be hit the hardest.
So a CBAM?
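The tariff logic described above can be sketched in a few lines. All the numbers and the function shape here are made up for illustration; the two ideas from the text are the flat industry-wide intensity per trading partner (to block "resource reshuffling") and the exemption for Paris-compliant or low-income, low-emission countries:

```python
def carbon_tariff(tonnes_goods, industry_intensity_tco2_per_t,
                  carbon_price_usd, exempt=False):
    """Hypothetical carbon border tariff: priced on the trading partner's
    industry-wide emission intensity, not per-company figures, with an
    exemption flag for Paris-compliant or below-threshold countries."""
    if exempt:
        return 0.0
    return tonnes_goods * industry_intensity_tco2_per_t * carbon_price_usd

print(carbon_tariff(1_000, 1.8, 50.0))                # -> 90000.0
print(carbon_tariff(1_000, 1.8, 50.0, exempt=True))   # -> 0.0
```

Using the industry average means a country can't dodge the charge by routing only its cleanest plants' output to export.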
the term “rare-earth minerals” is a misnomer: these elements are abundant and geographically dispersed. Eighty percent of the world’s lithium reserves, 66 percent of its nickel reserves, and 50 percent of its copper reserves are in democracies. Eighty percent of oil reserves, by contrast, are in OPEC countries, nearly all of which are autocracies
Useful stat
Companies using these services cannot learn by using these digital technologies because they pay only for use, not for access to the intangibles on the cloud.
This really lays out why some models have the clause for not training your own model. It's to avoid the creation of 'property' that a customer no longer needs to rent
But the climate impact of data centres could be significantly worse than this. Because of the huge strain data centres are placing on power grids, EirGrid placed a de facto moratorium on new connections around Dublin, causing many to seek a connection to the natural gas network to generate electricity on-site.
Wow, it's in Ireland too?!
The taxonomy is where the rules and data definitions are organised. It is comprised of a set of elements (i.e., Key Performance Indicators and narratives) and all the presentation, calculation and standard logic rules that are in effect. Once created, the XBRL taxonomy is made public as an open source file on the internet. Then, for a specific firm, software can be used to create an XBRL instance (the report itself), containing the specific facts and figures for a certain period. The XBRL instance can be checked against the taxonomy by all parties (reporting entity, a regulator, or even the public) in order to guarantee its data quality and reliability, as the taxonomy contains data quality checks that any XBRL engine can validate
This is actually a handy description
The disclosure required by paragraph 35 shall include the total energy consumption in MWh related to own operations disaggregated by: (a) total energy consumption from fossil sources (40); (b) total energy consumption from nuclear sources; (c) total energy consumption from renewable sources disaggregated by:
This is the disclosure requirement. It IS subject to materiality, so firms only need to report it if the figure is deemed material to their operations.
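Putting the taxonomy/instance split from the earlier note together with this disclosure, a minimal sketch of what an instance document could look like. Every namespace, element name, and figure below is a placeholder assumption, not the real ESRS taxonomy:

```xml
<!-- Hypothetical XBRL instance fragment: the taxonomy defines the
     fossil/nuclear/renewable concepts; the instance carries one firm's
     actual MWh figures for one reporting period. -->
<xbrli:xbrl xmlns:xbrli="http://www.xbrl.org/2003/instance"
            xmlns:esrs="http://example.com/placeholder/esrs"
            xmlns:utr="http://www.xbrl.org/2009/utr">
  <xbrli:context id="FY2023">
    <xbrli:entity>
      <xbrli:identifier scheme="http://standards.iso.org/iso/17442">LEI-PLACEHOLDER</xbrli:identifier>
    </xbrli:entity>
    <xbrli:period>
      <xbrli:startDate>2023-01-01</xbrli:startDate>
      <xbrli:endDate>2023-12-31</xbrli:endDate>
    </xbrli:period>
  </xbrli:context>
  <xbrli:unit id="MWh">
    <xbrli:measure>utr:MWh</xbrli:measure>
  </xbrli:unit>
  <!-- One fact per disaggregation bucket, all sharing the same context and unit -->
  <esrs:EnergyConsumptionFossil contextRef="FY2023" unitRef="MWh" decimals="0">12000</esrs:EnergyConsumptionFossil>
  <esrs:EnergyConsumptionNuclear contextRef="FY2023" unitRef="MWh" decimals="0">3000</esrs:EnergyConsumptionNuclear>
  <esrs:EnergyConsumptionRenewable contextRef="FY2023" unitRef="MWh" decimals="0">9000</esrs:EnergyConsumptionRenewable>
</xbrli:xbrl>
```

The validation step described earlier is then any XBRL engine checking these facts against the taxonomy's calculation and data-quality rules.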
DPI can be seen as part of a broader effort to reinvent our relationship to the internet—and, more generally, our digital ecosystem. A large part of its normative appeal stems from the “P” in the acronym: the sense that core functionality on the internet (i.e., identity, payments, data exchange) should not merely serve private ends but rather be reimagined as a set of public goods
Who needs Venmo when you have this?
This is a good policy, as unbundled REC purchases have a bad reputation when they are used across regions and countries as a cheap substitute for actual investments. But here they are being purchased to match investments in new generating capacity.
This is an interesting framing. I haven't seen it presented like this before. Is there a way to "re-bundle" them?
To use an extreme and blunt example, if an AI were tasked to stop global warming it might suggest to simply remove all the humans; that might get the job done (solve the task) but not in a way that is aligned with the intent (solve climate change while preserving human life).
Summarising the alignment problem
In this project, the data workers serve as community researchers. Every community researcher works for two to four months on their inquiry and is compensated for all their working hours. We collaborate with data workers globally. A decisive sampling criterion is that these are data workers who are already organized in workers councils, unions, communities, or advocacy organizations.
Could this be applied to sustainability working groups, Green Teams, and related ERGs?
On the matter of regulation and legislation, the UK needs to adopt similar legislation to that already in place in the EU, namely the Taxonomy Climate Delegated Act (including the Assessment Framework) and the Energy Efficiency Directive and its associated data centre delegated act to collect energy and other environmental data. This could be achieved by simply amending the existing Climate Change Agreement, extending the provisions to all data centres located in the UK and reducing the threshold for compliance reporting to 100kW.
Interesting - he's 100KW is the value that would cover telecoms as well, I think
Under the report’s net zero scenario, gas use would peak around the middle of this decade before halving by 2050, compared with 2022 levels. But the current trajectory suggests gas demand will continue to grow throughout the forecast, expanding by about a fifth by 2050. In the scenarios, demand for liquefied natural gas, which is cooled to be transported on ships, climbs by 40% and 30% above 2022 levels respectively.
fall in oil, growth in gas
BP has predicted that the world’s demand for oil will peak next year, bringing an end to rising global carbon emissions by the mid-2020s amid a surge in wind and solar power.
Oil company says oil demand will peak next year
Putting the various figures together shows that, far from the modest 29% year-on-year increase in the incomplete NBS data, there was a record 78% rise in solar generation in May 2024.
This was compared to May 2023 - so a nearly 80% jump in a single year, for China (!)
Clean energy generated a record-high 44% of China’s electricity in May 2024, pushing coal’s share down to a record low of 53%, despite continued growth in demand. The new analysis for Carbon Brief, based on official figures and other data that only became available last week, reveals the true scale of the drop in coal’s share of the mix. Coal lost seven percentage points compared with May 2023, when it accounted for 60% of generation in China.
Next year coal will be below 50% in China if the pace is kept
They present a significant policy opportunity supported by the country’s ongoing efforts to develop the green bond market and can catalyze promoting green and high-quality development, creating jobs, and delivering environmental benefits. The Greenpeace East Asia report analyzed over 8,000 projects covered by newly issued provincial and municipal government bonds in 2021 as a sample and found that one in five could have been issued as such Green and Sustainable Municipal Bonds.2
so basically, assuming there is a market for green bonds in the private sector, there are loads of 'bondable' projects
“For our customer base, there's a lot of folks who say ‘I don't actually need the newest B100 or B200,’” Erb says. “They don’t need to train the models in four days, they’re okay doing it in two weeks for a quarter of the cost. We actually still have Maxwell-generation GPUs [first released in 2014] that are running in production. That said, we are investing heavily in the next generation.”
What would the energy cost be of the two compared like this?
As solar is displacing traditional assets, such as gas power plants, we are observing a relatively sharp increase in the price of some of the power reserves, which happens in the opposite direction than the prices on the day-ahead markets.
This implies a massive rise in value for being able to provide reserve dispatchable power, like from batteries
Germany's coalition government is set to overhaul the way renewable energy is subsidised so that power producers would get one-off support for their investment costs instead of a guaranteed price for power they produce, a finance ministry document showed on Friday.
So this would either:
a) reduce the need to borrow so much cash from risk-averse bankers, or
b) not really help with the price volatility / merchant risk thing that makes it hard to get finance
Mark Boost, CEO of UK-based cloud company Civo, is similarly disappointed in the outcome. Boost said the deal is "not good news" for the cloud industry, and added several important questions still need to be answered. "We need to know more about how the process of compensation will work," he said. "Will all cloud providers in Europe be compensated, or just CISPE members? Is this a process that will be arbitrated by Microsoft? Where are the regulators in this?" Boost added that the deal will benefit CISPE members only in the short term, but that the cloud industry and its customers will pay the price in the long term.
This is a bit like how AWS will share sustainability data under NDA - works for that provider, but not everyone else.
However, EFRAG's move to publish a draft for consultation at the same time as the ESRS at the end of January 2024, which is aimed at SMEs not covered by the CSRD, came as a surprise: with the so-called Voluntary ESRS for Non-Listed Small- and Medium-Sized Enterprises, the "VSME ESRS", EFRAG is addressing all those companies that are not covered by the CSRD and thus the aforementioned concretisations (i.e., ESRS and ESRS LSME), but which nevertheless wish to take similar measures.
So basically this some guidance on what companies who aren't covered by the CSRD ought to cover to demonstrate following the intent of the law
To start, titanium ore is heated to 1,800 degrees Fahrenheit and reacted with chlorine gas and carbon-rich petroleum “coke.” This step yields a liquid chemical, titanium tetrachloride, and also produces carbon dioxide as a byproduct (similar to how blast furnaces for ironmaking release CO2).
I didn't realise titanium relied on electrolysis like aluminium too
There also exist well-known vulnerabilities for eBPF programs, which can allow attacks to break container isolation [13] and execute malicious code in the kernel [22]. Since Wattmeter is built on top of eBPF and accesses RAPL information, only privileged users should be allowed to access it.
So, vulnerable to platypus attack?
Figure 3 shows the effect of EFS on the CPU and energy shares between the two processes
it shows them using about the same power instead of the visible difference on the charts
Another framework for implementing custom scheduling policies without direct kernel modification is Meta’s sched_ext. sched_ext is a Linux kernel patch that proposes a new abstract scheduler class, which can be instantiated with scheduling programs in eBPF at runtime. There are ongoing discussions of upstreaming sched_ext into Linux. We have chosen to implement the scheduling policies proposed in this paper with ghOSt, but it is also possible to implement them with sched_ext
Wow, it might actually be upstreamed into linux proper?
He added that private companies, which will have a significant role in the transition, want to see policy certainty enhanced in the months ahead. The group is awaiting the finalisation of the Electricity Regulation Amendment Bill, which promises to open up the electricity market and put an end to Eskom’s longstanding monopoly, and the Integrated Resource Plan.
So, a market with more private generation firms, I'm guessing
consumes 19% less energy per event in high performance mode
in high power mode, it's more efficient for the amount of power it uses?
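The arithmetic behind this can be surprising, so here's a sketch with made-up numbers (not the paper's figures): energy per event is power times time divided by events, and a higher-power mode that finishes work faster can come out ahead.

```python
def energy_per_event(power_w, seconds, events):
    """Joules consumed per event: power (W) x duration (s) / event count."""
    return power_w * seconds / events

fast = energy_per_event(10.0, 100, 1000)  # high-power mode, done sooner
slow = energy_per_event(4.0, 300, 1000)   # low-power mode, runs much longer
print(fast < slow)  # -> True: more power, yet less energy per event
```

This is the race-to-idle intuition: drawing less power doesn't help if the job takes disproportionately longer.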
This shows that the impact of load shaping heavily depends on the power proportionality53 of the underlying hardware, and that it is not a reasonable measure per se.
Ah, so assumptions about physical hardware can't be blindly applied to cloud. Assuming the TEADS model is accurate
Although solar forecasts are not very promising this time, it again permits to discharge to 30%, as the carbon intensity during the next day is expected to be especially low. Instead of drawing carbon-intensive grid energy at night, the demand is thereby shifted to the next morning where the batteries are charged to 60% using cleaner energy.
In this case local generation is low, but the grid is relatively clean (maybe it's v windy, not sunny), so it's ok to run down the local store of greener energy in the battery
Although the carbon-aware experiment uses 2.4 % more energy than the baseline (which is because not all power modes have the same energy-efficiency), its associated carbon emissions through grid power consumption are 35.9 % lower. In the following, we will briefly analyze the two experiments and demonstrate how our integration enables research and development of carbon-aware applications.
How much power draw compared to the battery does this setup have? 32,000 mAh would be how long at max power draw for a Pi?
The control unit adaptively adjusts the battery's minimum state of charge and grid charge rate over time. In particular, in case of promising forecasts for solar power production or low carbon intensity, it is able to temporarily deplete the battery to 30%.
So, PUT-ing to the /soc endpoint with a target charge and a new C value
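A sketch of what that call might look like. The `/soc` endpoint comes from my note; the field names, URL, and request shape are all assumptions about the control unit's API:

```python
import json
import urllib.request

def build_soc_request(base_url, min_soc, grid_charge_c):
    """Build a PUT to the (assumed) /soc endpoint, setting the battery's
    minimum state of charge and grid charge rate (C value)."""
    body = json.dumps({"min_soc": min_soc, "grid_charge_c": grid_charge_c}).encode()
    return urllib.request.Request(
        f"{base_url}/soc",
        data=body,
        headers={"Content-Type": "application/json"},
        method="PUT",
    )

# e.g. allow depletion to 30% ahead of a forecast low-carbon morning:
req = build_soc_request("http://battery.local", min_soc=0.30, grid_charge_c=0.5)
print(req.get_method())  # -> PUT
```

The actual send would be `urllib.request.urlopen(req)`, left out here since the endpoint is hypothetical.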
This is especially the case when not testing virtualized applications on powerful hardware but embedded systems that often run only one energy-hungry process at a time. In these systems, load shaping is likely rather performed on a device level, for example, through DVFS.
Also with exclusive use of a GPU server, right?
Therefore, if applications under test are deployed on physical nodes like single-board computers, it is recommended to use dedicated hardware for measuring the power usage of these devices. For example, in our experimental testbed we monitor the current and voltage of a RaspberryPi 4b with a USB to USB measuring device equipped with an INA219 DC current sensor.
What kit has an INA219 DC current sensor these days? How can I buy one?
Note: The physical node's power usage is controlled via DVFS, while the virtual node uses rate limiting on the executed process.
Ah, two strategies for "Change the speed" of the three approaches I list in this post
“These claims assume that a company that pollutes more now should be able to pollute into the future. This means Global North companies will continue to inequitably dominate use of our remaining carbon budget. These findings should raise real questions for any bodies that claim to set standards for voluntary corporate climate targets,” David Tong, Global Industry Campaign Manager for Oil Change International, said.
A major blindspot is the fact that SBTi does not take into account new companies with their share of emissions coming into existence in the future. In SBTi’s framework, existing companies are allocated a share of the carbon budget without leaving any room for new players, some of whom might be more efficient or even working in the decarbonisation space like solar technologies. This further entrenches fossil fuel developments by existing companies and also raises questions about equity.
New firms could be better than incumbents as emissions are ring fenced for incumbents
Net-zero corporate pledges are voluntary, which means they can be reeled back in as quickly as they are announced. Shell scrapped its emission reduction target for 2035 when it sought to grow its gas business, for example, and BP walked back on some of its climate commitments when profits hit a record high. An Oil Change International assessment of the climate plans of eight oil majors—Chevron, ExxonMobil, Shell, TotalEnergies, BP, Eni, Equinor, and ConocoPhillips—released this year found that all eight continue to drive fossil fuel expansion and six have explicit goals to grow their total production volume this decade
Not all do so and there is no agreement on what a fair share is, but by claiming an individual target is Paris compliant is implicitly making an ethical claim that this represents your fair share of the global response, without saying how it is fair,” she said.
The Climate Policy paper points to issues that arise when individual countries or companies link their efforts to mitigate climate change to Paris Agreement goals. If such linking is indeed necessary, the authors say that assumptions about time scale, spatial scale and equity must be included in the analysis and presented transparently. The paper is “a welcome contribution” because it asks countries to center mitigation claims in a context of national contributions to global fair shares of the climate response, said Kate Dooley, a research fellow in the School of Geography, Earth and Atmospheric Sciences at the University of Melbourne. At present, many wealthy countries have exceeded their fair share of the carbon budget.
The distribution of emissions over time is not just a question of cumulative global emissions—it’s also a matter of equity. Rich, industrialized countries have already claimed a disproportionate share of the carbon budget that has brought us to around 1.2°C warming today. The U.S. has emitted about 25 percent of cumulative emissions; Europe, meanwhile, has emitted around 20 percent.
Good point
In fact, capturing CO2 from ethanol production is so easy that ethanol production is the biggest source of purified CO2 in the US, and the second biggest source in Europe. This is where we source the CO2 that we use in fizzy drinks, food production, fire extinguishers, etc.
Interesting. I thought it was Steam Reformed Methane
it did so by taking some short cuts. Android itself was an acquisition in 2005 (for $50 million, with a keyboard interface) and, in response to the iPhone, a new touch user interface was developed
I totally forgot that android was an acquisition
In their annual financial report, companies will have to disclose sustainability information in the management report. The details of the sustainability information to be published are set out in Commission Delegated Regulation (EU) 2023/2772 of 31 July 2023, to which the French Decree 2023-1394
So, 2023-1394 is the thing to keep an eye on
That’s no small task: In 2022, SAP spent approximately €7.2 billion in purchases from more than 13,000 suppliers worldwide, its annual report shows; 30% of that being on cloud services – more on that below.
7.2bn - so nearly 2bn of that is spent on cloud, all by themselves?
He is referring to a mechanism SAP first tested in 2016 across 10 countries. This prices CO2 (from €40 to €400 per ton; depending on the flight and its distance) to release money for sustainability initiatives, as SAP ramps up not just its own internal decarbonisation programme (it aims to reach net zero by 2030) but continues to build out a suite of sustainability products for its expansive customer base.
SAP have been using internal carbon pricing for nearly ten years, and they are sophisticated enough to have different prices even within aviation
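A toy sketch of how a distance-banded internal carbon price like this might work. Only the €40-€400/t range comes from the article; the distance bands and per-band prices below are invented for illustration, not SAP's actual scheme:

```python
# Hypothetical distance-banded internal carbon price.
# Only the 40-400 EUR/t range is from the article; bands are made up.
def internal_carbon_charge(tonnes_co2: float, flight_km: float) -> float:
    """Return the internal charge in EUR for a flight's emissions."""
    if flight_km < 1500:       # short haul: priced highest to deter flying at all
        price_per_tonne = 400.0
    elif flight_km < 4000:     # medium haul
        price_per_tonne = 150.0
    else:                      # long haul, often hard to avoid
        price_per_tonne = 40.0
    return tonnes_co2 * price_per_tonne

# A short-haul trip emitting half a tonne is charged at the top rate
print(internal_carbon_charge(0.5, 800))
```

The point of such a scheme is that the charge lands on the travelling team's budget, and the money raised funds the sustainability initiatives the article describes.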
As 45% of last year’s record solar additions were distributed generation, the exclusion of small solar installations is affecting these numbers a lot more than it used to.
Wow, nearly half of China's solar addition is distributed? How is it funded?
If there was one universal piece of advice I had for marketers seeking to broadly improve their organic search rankings and traffic, it would be: “Build a notable, popular, well-recognized brand in your space
Recognisable brand, over content?
A module on “Good Quality Travel Sites” would lead reasonable readers to conclude that a whitelist exists for Google in the travel sector (unclear if this is exclusively for Google’s “Travel” search tab, or web search more broadly). References in several places to flags for “isCovidLocalAuthority” and “isElectionAuthority” further suggest that Google is whitelisting particular domains that are appropriate to show for highly controversial or potentially problematic queries.
We know they whitelist election sites. Would they do this with climate science?
Our study suggests the transformation pathway will have three main phases.
peaking (slowing down fossil deployment and ramping up clean gen), then energy revolution (bulk transition), then consolidation (clean up)
Meanwhile, gas does not play a significant role in the power sector in our scenarios, as solar and wind can provide cheaper electricity while existing coal power plants
This is a surprise. I had assumed China might shoot for gas as a stop gap too, but it looks like they're skipping it
The upper panel in the figure shows the installed capacity of coal power plants and the lower panel their electricity production from 2021 to 2060.
Wow, China has more than a terawatt of coal generation in 2024
What is happening in China with electric vehicles is pretty stunning. China is the world’s largest auto market by far — in 2022 China sold 26.8 million vehicles, the U.S. sold 13.8 million and Japan was third with 4.3 million.
Holy shit.
If EVs make up half the cars sold in China, then are more EVs being sold in China than all cars and trucks in the USA?
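A quick sanity check on that question, taking the 2022 totals from the article and assuming the "half of sales are EVs" premise:

```python
china_total = 26.8e6   # vehicles sold in China, 2022 (from the article)
us_total = 13.8e6      # vehicles sold in the USA, 2022 (from the article)

china_evs_if_half = china_total / 2   # the premise in the note above
print(china_evs_if_half, us_total)    # 13.4m vs 13.8m: close, but not quite
```

So on these numbers it is just under: nearly, but not quite, as many EVs as all US vehicle sales.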
There is mounting evidence that the demand for imported cooking oil in the UK and Europe is being met with virgin palm oil that has been fraudulently passed off as waste. This would cancel out the fuel’s emissions savings, due to the land clearances for oil palm plantations.
YIIIIIKES
To produce the 12,750 MWh that will be generated by the data center at full capacity, 2660 t of pellets would have to be burned per year
This suggests that SIG is buying the heat, which is what makes the whole thing economically workable
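A back-of-envelope check on the article's figures; the implied energy density comes out in line with typical wood-pellet heating values (roughly 4.6-4.9 MWh per tonne):

```python
mwh_per_year = 12_750     # heat generated at full capacity (from the article)
tonnes_pellets = 2_660    # pellets burned per year (from the article)

mwh_per_tonne = mwh_per_year / tonnes_pellets
print(round(mwh_per_tonne, 2))  # ~4.79 MWh/t
```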
Instead of wasting the 45°C hot air from equipment and servers in the atmosphere, the new data center will inject this flow into air-to-water heat pumps; these will raise the temperature from 45°C to 67-82°C in order to adapt to the current requirements of the district heating installations of Services Industriels de Genève (SIG).
This suggests they are just complying with the law now? Or are they going beyond what is required?
First, it will require the operators of regional grids across the country to forecast their region’s transmission needs a full 20 years into the future, develop plans that take those forecasts into account, and update those plans every five years. In practice, this should mean a more robust consideration of new wind and solar options, as well as greater adherence to the net-zero emissions targets set by many U.S. states.
how far ahead did they need to forecast before?
Over the past three years, battery storage capacity on the nation’s grids has grown tenfold, to 16,000 megawatts. This year, it is expected to nearly double again, with the biggest growth in Texas, California and Arizona.
10x in 3 years?!
The critical sixth key was the contest key: Bernie Sanders’s contest against Clinton. It was an open seat so you lost the incumbency key. The Democrats had done poorly in 2014 so you lost that key. There was no big domestic accomplishment following the Affordable Care Act in the previous term, and no big foreign policy splashy success following the killing of Bin Laden in the first term, so there were
Bernie Sanders was the missing key for Hillary's loss, then?
Now, the United Nations has taken a first step toward filling in these data gaps with the latest installment of its periodic report on e-waste around the world. Released last month, the new Global E-Waste Monitor shows the staggering scale of the e-waste crisis, which reached a new record in 2022 when the world threw out 62 million metric tons of electronics. And for the first time, the report includes a detailed breakdown of the metals present in our electronic garbage, and how often they are being recycled.
I did not know Carbonflow was a thing that was done using PyPSA
Following an attributional approach, and with the assumptions retained, the Iroco study evaluates the greenhouse gas emissions attributable to one week's use of a mailbox at 63.2 gCO2eq.
So about 3.3 kg per year
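Converting the study's weekly figure to an annual one:

```python
grams_per_week = 63.2   # gCO2eq per mailbox per week (from the Iroco study)

kg_per_year = grams_per_week * 52 / 1000
print(round(kg_per_year, 2))  # ~3.29 kg CO2eq per mailbox per year
```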
However, corporate governance in the telecoms sector is not conducive to the kind of radical decisions that separation requires. Decision makers and boards are incentivised on metrics that require short term continuity, not disruption
So to make this argument in cloud/AI, you need to make the argument that firms are too conservative in terms of corporate structure, and as a result are missing out on greater value coming from two individually more valuable entities - something that the org or compensation structure is not set up for
Rural broadband in Europe is increasingly deployed in a neutral host model, with a single fibre network connecting homes and multiple service providers delivering services over that network.
This is like the unbundled approach already then
as pure infrastructure players borrow at lower rates and expect lower returns, they will deploy infrastructure where vertically integrated players won't, and with no (or less) public subsidies;
Infra with more patient capital isn't chasing such high risk returns like getting everyone to buy into AI
We have one leading example of voluntary structural separation resulting in increased valuation of the separated entities - Telecom NZ, in 2011. Post de-merger, after years of flat market valuation, both Chorus (infrastructure) and Spark (services) saw share prices rise. Between 2015 and 2023, the combined market capitalisation of Chorus and Spark grew by 150%, whereas that of European and US vertically integrated telcos grew only by 15%. The plan also delivered full FTTH coverage to 87% of New Zealand homes with 72% adoption. (Footnote: tower companies, or towercos, are infrastructure companies that manage mobile towers for multiple mobile network operators.)
So breaking them up led to greater share price rises AND better service in New Zealand
So should we collectively accept a radical shift in our policy and regulatory framework in order to ensure that the beacon of the European economy can deploy infrastructure that Sweden, Denmark, Spain, France, and many of the smaller European countries started successfully deploying long ago? This, in a nutshell, is why the Commission never talks about Germany.
This appears to be making the argument that rather than admit Germany has underinvestment compared to the rest of the EU, it's easier to change the entire EU telecom policy so vertically integrated telcos can make a better return.
The article confirms many of my suspicions — that, as The Information wrote, "other software companies that have touted generative AI as a boon to enterprises are still waiting for revenue to emerge," citing the example of professional services firm KPMG buying 47,000 subscriptions to Microsoft's co-pilot AI "at a significant discount on Copilot's $30 per seat per month sticker price." Confusingly, KPMG bought these subscriptions despite not having gauged how much value its employees actually get out of the software, but rather to "be familiar with any AI-related questions its customers might have."
This seems like spending roughly $17 million a year (at sticker price) to help them sell consulting about AI.
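Where that rough $17m figure comes from, using the seat count and list price quoted in the article (the actual, discounted price is undisclosed):

```python
seats = 47_000                # Copilot subscriptions KPMG bought (from the article)
sticker_per_seat_month = 30   # USD list price per seat per month (from the article)

annual_at_sticker = seats * sticker_per_seat_month * 12
print(annual_at_sticker)  # ~$16.9m per year before the discount
```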
It's important to understand these complexities because they are currently being flattened not just by corrupt government officials in Global South countries or fossil fuel executives in Texas, but also by a whole ecosystem of pundits like Jordan Peterson, Michael Shellenberger, Alex Epstein and even Joe Rogan, who use the idea that fossil fuel development will solve poverty in Africa as justification for continuing fossil fuel's dominance in the world. In fact, fossil fuel development hasn't even solved energy poverty in the African countries that have embraced it; Nigeria has the continent's largest and oldest fossil fuel industry and yet still has the world's lowest energy access rates.
Wow, I had no idea that energy access in Nigeria was so low.
To plug the financing hole Intel has had to rely on a wide variety of capital sources to fund everything: traditional debt financing, government support, and even more creative financial engineering schemes like the Brookfield fab deal to find a way to pay for everything.
This is what, around $60bn of government subsidy?
As noted by researchers at the University of Oxford and the Mercator Research Institute on Global Commons and Climate Change, carbon-intensive sectors like electricity generation actually contribute relatively little to the world economy, compared to high-value but lower-emitting sectors like IT, real estate, and social work.
what? the industries below RELY on electricity!
The above report is an underestimation of datacenter power demand, but there are plenty of overestimates too
Wow, so SemiAnalysis sees the IEA estimates as a lowball? I did not see that coming
The FTC, of which Khan is now chair, contends that Amazon has found a way to push up prices after all, without losing shoppers. It does this, the FTC argues, by imposing ever-higher fees on third-party sellers, who have no choice but to pass these costs on to shoppers by raising their prices. If those businesses try to sell more cheaply elsewhere, Amazon throttles their sales on its platform—which, thanks to the central place in e-commerce that Amazon has built up over the years, can be a death sentence for these businesses. So, according to the FTC, sellers generally absorb the high fees by inflating their prices not just on Amazon but across the web—precluding the price competition that could loosen Amazon’s grip on e-commerce.
This sounds like the "favoured nation status" from Cory's book
The disclosure required by paragraph 33 shall include the total energy consumption in MWh related to own operations as follows
and superconductors will at some point do to conventional power cables what fibre-optics did to conventional telephone cables.
? I assume this is referring to stuff like graphene.
Because we use systemd for most of our service management, these stdout/stderr streams are generally piped into systemd-journald which handles the local machine logs. With its RateLimitBurst and RateLimitInterval configurations, this gives us a simple knob to control the output of any given service on a machine. This has given our logging pipeline the colloquial name of the “journal pipeline”, however as we will see, our pipeline has expanded far beyond just journald logs.
I did not expect to see journald being used as the basic building block
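For reference, the rate-limiting knobs the post mentions look roughly like this in a journald drop-in. The values are illustrative, not their production settings (and since systemd 240 there are also per-unit LogRateLimitIntervalSec=/LogRateLimitBurst= options in a service's [Service] section):

```ini
# /etc/systemd/journald.conf.d/ratelimit.conf
# Illustrative values only.
[Journal]
# Window over which each service's messages are counted
RateLimitIntervalSec=30s
# Messages allowed per service per window; the excess is dropped
RateLimitBurst=10000
```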
Overall, Iceland produces approximately 20 terawatt hours of electricity per year.
About the same as Google?
Landsvirkjun built Kárahnjúkar despite protests, it started operating in 2009. But the controversy changed Iceland's energy policy. A fourth aluminium smelter was never finished, and in an announcement made in 2022, Landsvirkjun writes that it "will for the time being not focus on attracting new large-scale customers in the metal industry or other industrial commodities."
I did not know there was pushback against power projects even in Iceland
Meta previously bet on CPUs and its own in-house chips to handle both traditional workloads and AI ones. But as AI usage boomed, CPUs have been unable to keep up, while Meta's chip effort initially fizzled.
I did not know they tried to make their own chips
It’s past time for the IT and BPO Services industry to jump to a new S-curve driven by technology arbitrage if they wish to get back to another season of hockey stick growth:
Yikes. So basically, time to fire the people we outsourced to, and replace them with ChatGPT ?
I wrote about this in 2012 in a book called Liars and Outliers. I wrote about four systems for enabling trust: our innate morals, concern about our reputations, the laws we live under, and security technologies that constrain our behavior. I wrote about how the first two are more informal than the last two. And how the last two scale better, and allow for larger and more complex societies. They enable cooperation amongst strangers.
Morals and reputation
Laws and tech
They were national strategic programs. The strategic programs were aligned with nuclear weapons programs. The government picked and enforced a single design for all of the reactors. The reactors were GW-scale due to thermal efficiencies required for cost effectiveness. The government ran human resourcing. The programs ran for 20 or 30 years. They built dozens of nuclear reactors to maintain the teams and momentum and to share lessons learned.
These are the seven conditions necessary for nuclear to work, based on the historical evidence
A key point to remember about the US DOE is that 55% of its budget is related to commercial nuclear generation. The other 45% covers dams, geothermal, wind, solar, tidal, wave, biomass and biofuel energy.
Ah! Numbers here! This is super helpful!
A second MoU on gas export partnership was agreed between Riverside LNG of Nigeria and Germany's Johannes Schuetze Energy Import AG. Under the accord, Nigeria will supply 850,000 tons of natural gas to Germany annually, which is expected to rise to 1.2 million. The first deliveries will be in 2026, Ngelale said. The deal will help process about 50 million cubic feet per day of natural gas that otherwise would have flared.
OK, so instead of being flared for no money, it's being bought by Germany to burn instead.
Volkswagen teamed up with ClimatePartner GmbH to generate offsets so it can be “independent from the market where you can’t control what’s effective and what’s not,” said Esra Aydin, a spokesperson for the automaker. But the credits didn’t materialize in time last year and it ended up going back to the voluntary market where cheap renewable-energy credits were the only ones that worked with its budget.
"Worked with its budget"
In other words, the companies decided some offsets weren’t good enough to meet their own climate goals — and yet have continued selling them to customers as a feel-good solution.
That's a quite a quote
In contrast with the precipitous fallings-off of platforms like Myspace and LiveJournal, the decays of which have been clear and irreversible, Tumblr has kept defying expectations and stayed somewhat thriving against all odds. That is, in part, thanks to the guidance and guardianship of Automattic, which acquired Tumblr for the bargain-bin price of $3 million.
Holy balls. Only three mil?
The two corporations aren’t directly buying fuel from World Energy. Instead, they’ll purchase certificates representing SAF that gets pumped into the larger supply chain — then count the associated carbon reductions toward their sustainability goals. Microsoft agreed to buy certificates representing 43.7 million gallons of SAF over a 10-year period, while DHL signed a contract for 177 million gallons over seven years.
These are basically like EACs for fuel!
Provided that reliability and supply improve, the grid could become the optimal solution to provide almost 60% of people with access to electricity in each scenario. In the AC, Nigeria achieves universal access by stepping up efforts to provide off-grid solutions to those populations that live far from a grid.
Wow, really that much off grid generation?
However, some have criticised the CIPP for providing market-rate loans rather than special financing schemes, which will lead to high costs for Indonesia and could deter other countries from accepting similar deals in the future.
Ouch! What's the interest rate in Indonesia compared to North America for energy infra?
Here’s another problem: from 2020 to 2022, shopify claims around 31,000 tonnes of carbon removal. But according to the CDR.FYI database, only 4,000 of the tonnes they have purchased have been actually physically removed from the atmosphere
If you have increases in temporal resolution for electricity, why not for CDR?
Shopify don’t count the emissions footprint of the products sold by merchants in their actual climate data. No shipping, no manufacturing emissions, nothing (Amazon play a similar trick).
This is an interesting point - Shopify can argue they do it to avoid double counting, but that’s not really what scope 3 is designed for
It spent just over $5m to advertise its compact Bolt option over the same time period. Ford spent $61.2m (£48.7m) to advertise its electric F150 , and around $9m to advertise its mid-size electric Mustang. Among the top automotive advertisers, only BMW and Hyundai are spending as much or more to market their more efficient EVs.
The F150 is its biggest-selling automobile. I'd assume this would be higher, though, surely?
The CDP, previously known as the Carbon Disclosure Project and the most comprehensive global registry of corporate carbon emission commitments, recently said that of the 19,000 companies with registered plans on its platform, only 81 were credible.
Sheesh, less than half of one percent of the plans shared with CDP were credible?
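Checking that "less than half of one percent" claim against the CDP figures in the article:

```python
credible = 81        # plans CDP judged credible (from the article)
registered = 19_000  # companies with registered plans (from the article)

share = credible / registered
print(f"{share:.2%}")  # well under half a percent
```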
Over a year that equates to roughly seven million in savings and is precisely why the biggest names in tech are moving to colder, more remote locations.
How big does a DC need to be for this 7 million?
Downtime costs roughly $10,000 per minute in a hyperscale and is categorised as the highest risk.
This is pretty wild quote. I wonder what the source is?
The final coupon has been set at 4.183 percent
Borrowing basically a billion dollars at 4.2 percent
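What that coupon implies in annual interest, assuming a round $1bn principal (my assumption; the note above only says "basically a billion"):

```python
principal = 1_000_000_000  # assumed round figure, not from the article
coupon = 0.04183           # final coupon rate (from the article)

annual_interest = principal * coupon
print(f"${annual_interest:,.0f} per year")
```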
CSP makes it possible for server administrators to reduce or eliminate the vectors by which XSS can occur by specifying the domains that the browser should consider to be valid sources of executable scripts. A CSP compatible browser will then only execute scripts loaded in source files received from those allowed domains, ignoring all other scripts (including inline scripts and event-handling HTML attributes).
I hadn't come across this before a recent project. I didn't know you could block inline styles or inline JavaScript in this way
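For reference, a minimal policy of the kind described might look like this (the CDN domain is a placeholder). With no 'unsafe-inline' in script-src, inline `<script>` blocks and event-handler attributes like onclick= are refused; a style-src directive without 'unsafe-inline' does the same for inline styles:

```
Content-Security-Policy: default-src 'self'; script-src 'self' https://cdn.example.com; style-src 'self'
```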
The approximate cause of power problems often isn’t that hard to find. Fixing them is often the hard part. Good luck.
Quotable
Intel processors also support multiple P-states. P0 is the state where the processor is operating at maximum frequency and voltage, and higher-numbered P-states operate at a lower frequency and voltage to reduce power consumption. Processors can have dozens of P-states, but the transitions are controlled by the hardware and OS and so P-states are of less interest to application developers than C-states.
These exist too, but we can only control them indirectly at best.
Intel processors have aggressive power-saving features. The first is the ability to switch frequently (thousands of times per second) between active and idle states, and there are actually several different kinds of idle states. These different states are called C-states. C0 is the active/busy state, where instructions are being executed. The other states have higher numbers and reflect increasing deeper idle states. The deeper an idle state is, the less power it uses, but the longer it takes to wake up from.
Mental note: Think "C for the CPU cool down"
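On Linux you can see which C-states a machine advertises through the standard cpuidle sysfs interface. A small sketch (it returns an empty list on machines without cpuidle, e.g. many VMs, or on non-Linux systems):

```python
from pathlib import Path

def cstate_names(sysfs: str = "/sys/devices/system/cpu/cpu0/cpuidle") -> list:
    """Return the advertised C-state names, shallowest to deepest."""
    root = Path(sysfs)
    if not root.exists():  # no cpuidle driver, or not Linux
        return []
    return [p.read_text().strip() for p in sorted(root.glob("state*/name"))]

# Typically prints something like ['POLL', 'C1', 'C1E', 'C6'] on Intel hardware
print(cstate_names())
```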
Even before Hardin’s ‘The Tragedy of the Commons’ was published, however, the young political scientist Elinor Ostrom had proven him wrong. While Hardin speculated that the tragedy of the commons could be avoided only through total privatisation or total government control, Ostrom had witnessed groundwater users near her native Los Angeles hammer out a system for sharing their coveted resource. Over the next several decades, as a professor at Indiana University Bloomington, she studied collaborative management systems developed by cattle herders in Switzerland, forest dwellers in Japan, and irrigators in the Philippines. These communities had found ways of both preserving a shared resource – pasture, trees, water – and providing their members with a living. Some had been deftly avoiding the tragedy of the commons for centuries; Ostrom was simply one of the first scientists to pay close attention to their traditions, and analyse how and why they worked.
The features of successful systems, Ostrom and her colleagues found, include clear boundaries (the ‘community’ doing the managing must be well-defined); reliable monitoring of the shared resource; a reasonable balance of costs and benefits for participants; a predictable process for the fast and fair resolution of conflicts; an escalating series of punishments for cheaters; and good relationships between the community and other layers of authority, from household heads to international institutions.
Among his proposed solutions to the tragedy of the commons was coercive population control: ‘Freedom to breed is intolerable,’ he wrote in his 1968 essay, and should be countered with ‘mutual coercion, mutually agreed upon’. He feared not only runaway human population growth but the runaway growth of certain populations. What if, he asked in his essay, a religion, race or class ‘adopts overbreeding as a policy to secure its own aggrandisement’? Several years after the publication of ‘The Tragedy of the Commons’, he discouraged the provision of food aid to poorer countries: ‘The less provident and less able will multiply at the expense of the abler and more provident, bringing eventual ruin upon all who share in the commons,’ he predicted. He compared wealthy nations to lifeboats that couldn’t accept more passengers without sinking.
YIKES
Mobile network operators, often government sanctioned monopolies granted via spectrum licenses, are in increasing competition with an explosion of “open spectrum” technologies and new operating models for wireless networks.
there are well understood risks of turning unchaperoned engineers loose on social problems. And more so as the engineering tools become more specialized: if your only tool is an abstract algebraic curve, all the world becomes a cryptographic nail.
Crypto as the hammer that turns everything into a nail.
This liminal space, often vaguely described as the “Internet of Things” (IoT) doesn’t hold much resemblance to the Internet, at least as originally conceived. And the “things” commonly encountered are toasters, washing machines, and toothbrushes, with an often inexplicable desire to “connect.”
IoT’s current failures are also an opportunity. Beyond the absurdity of overly chatty microwaves, plugging the physical world directly into extractive (and often tenuous) business models has raised public awareness of the shortcomings of how we currently build digital infrastructure. Whether it’s a bike that stops working when its manufacturer goes bankrupt, or a home appliance with unclear allegiances, it has created real harms from exfiltration of personal data, and unnecessarily reducing the lifespan or increasing fragility of devices we depend on as part of daily life.
Overly chatty microwaves, and home appliances with unclear allegiances - there are so many quotable bits in this post!
a home appliance with unclear allegiances,
This is a wonderful turn of phrase
For an example of an unintended consequence, let’s say the result of your optimization project is spare capacity at a cloud provider. Then that capacity is then offered on the spot market, which lowers the spot price, and someone uses those instances for a CPU intensive workload like crypto or AI training, which they only run when the cost is very low. The end result is that capacity could end up using more power than when your were leaving it idle, and the net consequence is an increase in carbon emissions for that cloud region.
This is because capacity, utilisation and power used are related, but different concepts.
Your capacity, which you share back to the pool, is then available to someone else, who ends up buying it to use for a task with a higher average utilisation, resulting in more power being used in absolute terms, even if less is attributed to you.
This also raises the question - who is responsible - the customer for making the capacity available, for the cloud provider who accepts this workload?
The cloud providers get to set the terms of service for using their platform.
King Kamehameha
THAT'S where the Kamehameha comes from?
Average retail electricity rates in Hawaiian Electric territory are higher today than they were a decade ago, per state data. Kauai’s rates have dropped from the highest in the state to the lowest, as KIUC shifted from costly imported fossil fuels to cheaper solar generation.
WOW
but also depend on a global renewable energy production whose capacity cannot exceed 30% globally (EIA),
I don't understand this reference
It should be noted that in France, regulations do not allow this market-based approach when reporting company level CO2e emissions : “The assessment of the impact of electricity consumption in the GHG emissions report is carried out on the basis of the average emission factor of the electrical network (…) The use of any other factor is prohibited. There is therefore no discrimination by [electricity] supplier to be established when collecting the data.” (Regulatory method V5-BEGES decree).
Companies are barred from using market based approaches for reporting?
How does it work for Amazon then?
In the end, there was an undisclosed settlement between Verizon and Mozilla, but ComputerWorld later reported that financial records showed a $338 million payment from Verizon in 2019. On top of revenue-sharing with Google, that payment drove up Mozilla's revenue, which in 2019 reflected "an 84 percent year-over-year increase" that was "easily the most the open source developer has booked in a single year, beating the existing record by more than a quarter of billion dollars," ComputerWorld reported. Perhaps that bonus payment made switching back to Google even more attractive at a time when Baker told the court she "felt strongly that Yahoo was not delivering the search experience we needed and had contracted for."
Wow, it represented a 340 million USD bonus to switch from Yahoo to Google?
Finally, over a longer time frame, novel technologies could drastically lower battery-related mineral demand for nickel and copper in particular, but the mineral intensity of next-generation battery chemistries remains uncertain and could even increase demand for some battery minerals. For example, solid-state battery chemistries could increase lithium demand by up to 28%.11
Why are we not talking about copper and nickel, and talking so much about lithium and cobalt?
Is it just the relative novelty?
Figure 1. Comparison of ore extraction in IEA NZE scenario and sensitivity of substantially improved recycling and potential ore grade decline
This implies that total mining would go DOWN under a transition, possibly by half.
The ore grade decline is less of an impact than I expected too.
Ore extraction is largest for EVs, growing 55 times from 2021, compared to 13 and 9 times for solar PV and wind power, respectively
The share of mining going to cars is not really something we discuss in digital technology discussions enough
This means that the high demand for minerals through the energy transition is of a (temporary) stock building nature, while fossil-related extraction is continuous and dissipative. In the longer term, the decommissioning of end-of-life renewable generation provides opportunities for reuse or recycling. This can mitigate demand of primary produced minerals for new renewable installations
We can recover and reuse the minerals at end of life. Not so much with fossil fuels
For the following types of tasks, users did NOT appreciate being sent to a new browser tab or window:
For starters, if you are seeking to use a green financial instrument to finance your data center construction project anywhere in the world, and have European investors, then the Taxonomy Climate Delegated Act (TCDA) will apply. This requires the operator to implement 106 of the EU Code of Conduct for Data Centres (Energy Efficiency) best practices as well as undertake various other activities
Oh, this is new to me - if you want the cheap money, you need to meet the EU Code of Conduct for Data Centres
While they are capable of providing constant power, hydrogen fuel cells are also being considered for providing backup power to data centers. This is greatly appealing to data center operators as a more environment-friendly replacement for traditional diesel generators. This change would see the use of fast-start fuel cells, such as proton exchange membrane (PEM) fuel cells which could take the place of diesel generators.
Proton exchange membranes are presumably the newer types of fuel cells, that are more flexible and need less heat, but are more pricey
The key benefit and primary motivation for installing hydrogen fuel cells within a data center is to reduce carbon emissions. As stated, some fuel cells, such as SOFCs, can use natural gas. While it is less damaging to the environment than diesel, it still results in significant carbon emissions.
Is this some of the missing context for the CCS next to datacentre patents from M$?
If you can capture the CO2, and use the waste heat to separate the CO2 from the absorbing material, then it might improve the economics of the SOFC fuel cells, AND deal with the CO2 emissions problem.
SOFCs and PEM fuel cells differ from one another in their construction, materials, and operation. In a high-level view, the primary differences are the electrolyte materials (where the hydrogen and oxygen react) and operating temperatures. SOFCs operate at high temperatures, requiring longer start-up times and as a result, only being suitable for continuous power supply. PEMs, by contrast, operate at lower temperatures and are capable of fast-start or continuous operation, but are a more expensive option.
Oh wow, so there are two kinds of fuel cells, and the expensive one is the fast ramp up one
The majority of fuel cells currently in use in data centers are solid oxide fuel cells (SOFCs), providing constant power. SOFCs can generate power through the conversion of fuels such as natural gas and biogas, into hydrogen, which is then reacted in the fuel cell to generate power. While using natural gas still results in carbon emissions, SOFCs are able to generate power with higher efficiency than combustion engines. SOFCs can also be fuelled directly with hydrogen, although this is not the norm due to hydrogen’s cost and availability.
Wow, greater efficiency than gas turbines / engines?
A decade ago, these plants provided “low carbon” electricity in comparison to the grid at the time, but now in many cases, emit more carbon than local grids. Countries that already have decarbonized grids, France, Sweden, and Scotland, for example, will not benefit from a continuous system that uses natural gas to begin with.
What would the green premium be for biomethane in this scenario? As in methane from biogenic sources. Supply issues aside, obvs.
One suggestion is for greater integration of energy systems, which would see data centers located adjacent to energy industries or having data centers integrated with hydrogen-generating plants and fuel cells. This solution sidesteps planning problems by locating data centers alongside low-carbon energy industries. A major issue with this (aside from the available land) is blurring the lines between the data center operators, utility providers, and energy companies.
This flips the assumptions of what a datacentre looks like. If the power density keeps increasing, instead of a DC with on-site power, you might have a power plant with an on-site DC
From a mechanical perspective, designing a hydrogen storage system is significantly more complex than a diesel storage system. Hydrogen has more storage options available; however, it presents higher risks than diesel, such as greater flammability and explosivity, higher pressures, the potential for low temperatures, or hazardous chemical storage methods. The mechanical design for such a system must therefore comply with rigorous safety standards.
Ok, so H2 unsurprisingly is vastly more explodey, and hard to store safely and cheaply
equipped with a divestment strategy for non-compliant assets.
mandatory divestment? blimey
And if you look at the total consumption of semiconductors by the Chinese manufacturing industry, then China imports more semiconductors than they import oil.
China spends more importing semiconductors than on importing oil? really?
This is from Peter Wennink, CEO of ASML
Google has argued that switching search engines is just a click away and that people use Google because it's the superior search engine. Google also argued at trial that Microsoft's failures with Bing are "a direct result of Microsoft’s missteps in Internet search."
This is interesting - I wonder how 3rd parties like Mozilla or Vivaldi testify?
If they say it's hard, they contradict their own marketing, and risk their main source of revenue.
If they say it's easy, they risk undermining all their own comms around the importance of choice, and the necessity of more diverse ecosystems.
Provide a transitional implementation for network operators and protocols that do not yet support standards-based Layer 2-3
This suggests to me that there is a lot of proprietary layer 2-3 in IOT. Is this the case?
Despite the unknowns, the technology definitely piques the interest of owners and operators of data center facilities, with analyst firm Omdia noting that more than 80 percent of survey respondents will most likely deploy grid-interactive UPSs within the next five years. As usual for this industry, the technology will require more deployments and for it to mature before it becomes the standard way data centers and other mission-critical facilities will be built.
I wonder how much extra flex and capacity this would represent?
While exact numbers are not disclosed, such as battery capacity and how much Microsoft is willing to make available for grid interactivity, the company claims that, over the next couple of years, this move will remove about two million metric tons of carbon dioxide emissions that would otherwise be generated from Ireland’s National Grid.
This is about an eighth of Microsoft's reported emissions in 2022, I think
Let’s just pause to emphasise this point: no amount of R&D, innovation or technology will allow us to remove CO2 from the atmosphere without spending at least 191 kWh per tonne of CO2.
Even if we achieve perfect efficiency, we tax the global economy 10% to remove annual emissions
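Quick back-of-envelope on that 191 kWh/tCO2 floor. The annual-emissions and global-electricity figures below are my rough assumptions, not from the source:

```python
# Back-of-envelope: thermodynamic floor for removing a year of emissions.
# 191 kWh/tCO2 is the minimum quoted in the text; the other two numbers
# are rough assumptions of mine.

MIN_KWH_PER_TONNE = 191
ANNUAL_EMISSIONS_T = 37e9        # ~37 GtCO2/yr, rough
GLOBAL_ELECTRICITY_TWH = 27_000  # rough annual global generation

min_twh = MIN_KWH_PER_TONNE * ANNUAL_EMISSIONS_T / 1e9  # kWh -> TWh
share = min_twh / GLOBAL_ELECTRICITY_TWH

print(f"{min_twh:.0f} TWh/yr minimum")        # ~7067 TWh
print(f"{share:.0%} of global electricity")   # ~26%
```

So even the theoretical floor is a quarter-ish of today's entire electricity supply, before any real-world inefficiency.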
Table 5.10a. Solar REC prices during the period March 2020 – March 2021 (US$/MWh)
These are nearly 100x the cost of hydro RECs in some cases
It should be noted that the third normative approach considers the global need for electricity as outlined by the International Energy Agency (IEA) for different scenarios and develops an interim 1.5DS within which ICT should not expand its current share of electricity. This electricity budget uses the IEA trajectories for a 2°C scenario (2DS) and a below 2°C scenario (B2DS) to derive a 1.5°C trajectory for world electricity usage through doubling the difference between them and subtracting it from the 2DS as described in [IEA ETP]. This is an interim approach as IEA has not yet established a 1.5DS. The budget is then used to determine the amount of electricity that could be used by the sector if keeping its share at the current level. As the IEA is planning to include a specific 1.5DS, the trajectories will be reviewed when the new IEA scenarios are published.
This is the only mention of a "fair share" of global electricity use by the ICT sector
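The 1.5DS derivation the quoted passage describes is just a linear extrapolation past B2DS. A quick sketch, with made-up TWh numbers:

```python
# The extrapolation described in the text: take the step from 2DS to
# B2DS and extend it once more in the same direction, i.e.
#   1.5DS = 2DS - 2 * (2DS - B2DS)
# The example figures are illustrative, not real IEA trajectories.

def interim_15ds(twh_2ds: float, twh_b2ds: float) -> float:
    return twh_2ds - 2 * (twh_2ds - twh_b2ds)

# e.g. if 2DS allowed 30,000 TWh and B2DS 28,000 TWh in some year:
print(interim_15ds(30_000, 28_000))  # 26000.0 TWh
```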
The VIC system created maintains a list of available URLs or DNS entries for the service requiring computation, each of which live in a different datacenter in a different part of the world. When the service receives a request, it routes that request to the datacenter that best matches the required criteria
LOL it's DNS again!
A TCP/IP based system was built to investigate the feasibility of a VIC as per Fig. 1. This considers routing Microsoft Azure workloads to datacenters around the world to best satisfy the required criteria. Three criteria were applied: (1) reduce load on a particular electricity grid at peak times, (2) follow green energy and low emissions energy around the world with computation, thus reducing curtailment of renewables and facilitating more renewable energy and (3) take advantage of variable electricity market prices.
Three criteria reworded:
(1) reduce peak load on the particular electricity grid (2) reduce fossil generation in the compute (3) reduce cost of electricity use for the compute
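A minimal sketch of how a router over the three criteria might score datacenters. All names, weights, and numbers here are hypothetical, not the paper's actual VIC implementation:

```python
# Hypothetical VIC-style datacenter selector: pick the datacenter that
# best satisfies (1) low local grid load, (2) low carbon intensity,
# (3) low electricity price. Weights and normalisation are illustrative.

from dataclasses import dataclass

@dataclass
class Datacenter:
    url: str            # DNS entry for this location, per the paper's design
    grid_load: float    # 0-1, fraction of local grid peak
    carbon_gkwh: float  # gCO2/kWh of the local grid right now
    price_mwh: float    # current spot price, $/MWh

def score(dc: Datacenter, w_load=1.0, w_carbon=1.0, w_price=1.0) -> float:
    # Lower is better; crude normalisation against 500 g/kWh and $100/MWh.
    return (w_load * dc.grid_load
            + w_carbon * dc.carbon_gkwh / 500
            + w_price * dc.price_mwh / 100)

def route(fleet):
    return min(fleet, key=score).url

fleet = [
    Datacenter("eu-north.example", grid_load=0.3, carbon_gkwh=40, price_mwh=35),
    Datacenter("us-east.example", grid_load=0.9, carbon_gkwh=450, price_mwh=90),
]
print(route(fleet))  # eu-north.example
```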
Furthermore, many markets grant Capacity Payments to eligible generators and interconnectors for being available to follow dispatch instructions, irrespective of actual generation. VICs would also be eligible to receive such payments, but they should be paid directly to the VICO rather than to any users. This allows users to bid without the influence of capacity payments and rewards the datacenters providing the VIC.
Under this scenario, a grid operator would pay a datacentre to be prepared to switch off local physical load.
The load would still be served, albeit on a different grid or grid region that didn't have the same strains on the electricity network.
Potential to combine with steam turbines to generate more power
I assumed gas turbines use water to heat up steam to generate the power. This suggests this is not the case.
gas engines & gas turbines
engines and turbines are different things
Teleoperators are the world’s second-largest consumer of batteries. Elisa is also offering its Distributed Energy Storage solution to teleoperators in other countries so that they can improve the reliability of their own mobile networks and do their part in accelerating the green transition by investing in a distributed battery reserve and utilising it to provide balancing services in their electricity markets.
What is a teleoperator?
Net Zero Sales covers scope 1, 2 and 3 emissions on an intensity, full equity-share basis across the value chain and seeks to reduce these:
a. By 5% by 2025
b. By 15-20% by 2030
c. To net-zero by 2050
Net Zero Emissions Commitment covers scope 1, 2 and 3 emissions on an intensity, partial equity-share basis across the value chain and seeks to reduce these:
a. By 15% by 2025
b. By 28% by 2030
c. By 55% by 2040
d. To net-zero by 2050.
Net Zero Production covers scope 3 emissions on an absolute, full equity-share basis in the upstream sector, excluding 3rd party crude, and seeks to reduce these:
a. By 10-15% by 2025 [20%]
b. By 20-30% by 2030 [30-40%]
c. To net-zero by 2050
Net Zero Operations covers scope 1 and 2 emissions on an absolute, operated-asset basis across the value chain and seeks to reduce these:
a. By 20% by 2025
b. By 50% by 2030
c. To net-zero by 2050
“A reporting organization should not purchase renewable electricity and simply apply it to scope 3 emissions without involvement from its supplier or customer.” Renewable Electricity Procurement on Behalf of Others: A Corporate Reporting Guide (page 4), EPA, 2022.
Microsoft’s 2021 Environmental Sustainability Report includes 11 of the 15 scope 3 categories (page 19), while Google reports business travel and employee commuting as one total and “other” scope 3 emissions in a second total (page 11). Apple (page 84) and Amazon (page 97) report lifecycle emissions from customer trips to physical stores under scope 3, which are not categories prescribed by the GHG Protocol.
Certigy, a European EAC registry, has enabled hourly certification across many EU countries
This is a tool offered by Unicorn.com specifically for energy reporting
Apple, which has a relatively long history of reporting its scope 3 emissions, states in its 2022 Environmental Progress Report that it is actively evolving its scope 3 accounting methodology. “In fiscal year 2017, we started calculating scope 3 emissions not listed above. In fiscal year 2021, these include electricity transmission and distribution losses [...] and life cycle emissions associated with renewable energy. We have not accounted for emissions resulting from employees working from home [...] we are still evolving our methodology.“ Environmental Progress Report (page 84), Apple, 2022
This also explains why, even though Norway’s grid-level emissions factor is 10 kg CO2/MWh17 (98% carbon-free), the residual emissions factor is 402 kg CO2/MWh (7.4% renewable), reflecting that most EACs produced within the Norwegian grid are claimed and retired outside of the country.
Data demonstrate that many companies do indeed pursue this practice. For example, Norway was responsible for 43% of all guarantees of origin (GOs) exports in Europe in 2022, many of which were purchased by companies whose operations have no connection to the Norwegian grid on which these EACs were produced.
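A toy model of why exporting GOs pushes the residual factor up: the zero-carbon attribute leaves with the certificate, so the same emissions get spread over a smaller pool of unclaimed generation. All numbers are illustrative assumptions, not Norway's real data:

```python
# Toy model: grid-average vs residual-mix emissions factors.

def grid_average_factor(gen_mwh, emissions_kg):
    # Everything produced on the grid, averaged.
    return emissions_kg / gen_mwh

def residual_factor(gen_mwh, emissions_kg, clean_mwh_claimed):
    # EACs claimed elsewhere take their zero-carbon attribute with them,
    # leaving the same emissions over less unclaimed generation.
    return emissions_kg / (gen_mwh - clean_mwh_claimed)

gen = 100.0         # MWh generated (normalised)
emissions = 1000.0  # kg CO2 (say, 2 MWh of fossil at 500 kg/MWh)
claimed = 96.0      # MWh of hydro whose GOs were sold abroad

print(grid_average_factor(gen, emissions))       # 10.0 kg/MWh
print(residual_factor(gen, emissions, claimed))  # 250.0 kg/MWh
```

(The real residual mix calculation also folds in a European attribute mix, which is why Norway's published residual figure is even higher than this toy version produces.)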
Wind and solar are, of course, intermittent, but battery costs too are plummeting, to the extent that they often underbid so-called peaking plants burning natural gas
Where would I look to find public evidence of this?
This article in effect requires local authorities to only use data centres that are fully compliant with the requirements of the specified directives.
Minimum, mandatory standards for public procurement.
Article 33 Delegated Acts 3. The Commission is empowered to adopt delegated acts in accordance with Article 34 to supplement this Directive by establishing, after having consulted the relevant stakeholders, a common Union scheme for rating the sustainability of data centres located in its territory. The Commission shall adopt the first such delegated act by 31 December 2023. The common Union scheme shall establish the definition of data centre sustainability indicators and shall set out the key performance indicators and the methodology to measure them.
There's a policy deadline to work to - end of this year - so it's likely there'll be a push to try to weaken it even more.
5. By 15 May 2025, the Commission shall assess the available data on the energy efficiency of data centres submitted to it pursuant to paragraphs 1 and 3 and shall submit a report to the European Parliament and to the Council, accompanied, where appropriate, by legislative proposals containing further measures to improve energy efficiency, including establishing minimum performance standards and an assessment on the feasibility of transition towards a net-zero emission data centres sector, in close consultation with the relevant stakeholders
The Commission shall establish a European database on data centres that includes information communicated by the obligated data centres in accordance with paragraph 1. The European database shall be publicly available on an aggregated level.
It will be law to collect this data now, so we know orgs will have to have it in structured form.
If it's gonna be in the public domain anyway, there's a good argument for paving a path to make it easy to look transparent, by making it easy to disclose this info in a well-presented, human-readable way
We think that the threshold for reporting of 500kW is too high, as it will not pick up aggregated edge sites (organisations that have multiple facilities that are individually below the threshold but aggregated would be considerably higher), mobile phone base station sites, and Points of Presence (PoPs)
So this lobbying will now have effectively hidden 4G/5G infra, as well as a lot of edge
This links back to the lower limit for reporting of 500kW; clearly the EC sees no reason to collect data from 'distributed compute' such as individual server rooms (at this time)
This is very interesting. We've seen some national government initiatives in various countries relating to the collection of data centre energy data; in the UK we have the Climate Change Agreement for Data Centres, which started in 2014 and, according to the latest figures (4th period), indicated that total UK energy use over the period for commercial data centres (colocation sites) was 12TWh. Back to the EED: the very interesting thing is the mention of 'interventions' - will this mean the imposition of fines for poor energy performance?
This introduces the requirement for a data centre register and the rating of data centres for sustainability. This could become an utter can of worms, as sustainability is closely linked to the amount of renewable energy on an individual country's grid (the 'grid mix') and could favour those countries and data centres that are directly linked to low-carbon energy sources (Norway, Sweden and Finland (hydro), France (nuclear), Spain (solar/wind) and Denmark (wind)).
This also means, though, that those countries ought to be less keen on selling their EACs
It’s pretty simple: don’t let carbon removal excuse ongoing or worsening emissions. That means no deal with fossil fuel majors. And if you really must sell carbon credits, here’s my idea: they can only apply to the earliest emissions first. Once we’ve dealt with the ~2,500 ish human-added CO2 gigatonnes in the atmosphere, THEN you can apply credits to new emissions. We go from oldest to newest, not from newest to oldest.
What about the compounding of warming for these tonnes?
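The oldest-first rule proposed above is essentially a FIFO ledger: removals retire the front of the historical emissions queue before any credit can touch new emissions. A sketch, with all numbers illustrative:

```python
# Sketch of the "oldest emissions first" carbon credit rule.
# A removal retires tonnes from the front (oldest) of the historical
# ledger, never against this year's emissions. Figures are made up.

from collections import deque

ledger = deque([   # (approx. year, GtCO2 outstanding) - illustrative
    (1900, 500.0),
    (1950, 800.0),
    (2000, 1200.0),
])

def apply_removal(ledger, gigatonnes):
    """Retire removals against the oldest outstanding emissions first."""
    remaining = gigatonnes
    while remaining > 0 and ledger:
        year, amount = ledger[0]
        if amount <= remaining:
            remaining -= amount
            ledger.popleft()          # this vintage fully retired
        else:
            ledger[0] = (year, amount - remaining)
            remaining = 0
    return ledger

apply_removal(ledger, 600.0)
print(ledger[0])  # (1950, 700.0): 1900 cleared, 100 Gt off the 1950 vintage
```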
AIVF
wagtial
imporvements
sustaninable