1,018 Matching Annotations
  1. Oct 2024
    1. The financing strategy was widely used during the 1997-2010 Labour government, and finally ditched by the Conservatives in 2018 after several NHS trusts required bailouts stemming from the high cost of the PFI schemes. Existing contracts were untouched. A report by the National Audit Office, the spending watchdog, that year found taxpayers had incurred billions of pounds in extra costs for no clear benefit through PFIs, with yearly fees running at £10bn. PFI deals are on average 31 years long and generally involve the assets reverting back into public hands at the end. Nearly 71 PFI contracts worth about £4bn are coming to an end in the next four years. [Chart: More than 300 PFI contracts will expire in the next decade. Number of PFI contracts expiring, by financial year. Source: National Audit Office] A common trigger for clashes when contracts near expiry is the condition of the assets, and the question of who should pay for any deficiencies.

      Oh, so this is useful and interesting because I remember hearing about these when they were first proposed. Essentially, one of the problems even when infrastructure reverts to public hands is that it's in crap shape and it's not clear whose job it is to pay to fix it up.

    1. The tech companies, which declined interview requests, said every watt of power they use for their data centers is matched with purchases of clean energy elsewhere on the regional power grid. But those contracts feed into a vast power grid, spanning 14 states from Louisiana to Montana. Many experts and activists say much of that clean power would probably get produced whether the tech companies were signing contracts or not.

      Which grid would this be?

    1. In detail, this comprises a change in market dynamics from a “search engine market” to a “web data market”, where web data is used in manifold applications (like social media research, training of AI products, price monitoring, etc.). For this study, we remain aware of such disruptive forces of altered search, but have not explicitly integrated this in our estimates – although, LLMs still need valid and scalable (web) information for training, which is present in our market model.

      Increasing amounts of this stuff are now blocked. It's becoming the norm.

    1. If such battery capacity had already been installed this summer, Germany could have displaced 36 GWh of expensive fossil power during evening peaks in June alone.

      I think this is referring to the 2 GW of storage in the report highlights. Thing is though, 2 GW of storage is still likely to be pricey, surely?

    2. The role of gas in the evening peak in April 2024 has been roughly halved compared to April 2021.

      Wow, this is a stat to remember

    1. An interesting heuristic I came up with is that we should install 1 kWh of batteries for every kilowattpeak (kWp) of solar panels. On average solar panels will output around 20% of their rated peak power worldwide (see table). A good rule of thumb is that you want a minimum of 5 hours of storage in the future renewable grid. Five times 20% is 100%, hence 1 kWh of batteries for 1 kWp of solar panels.

      This shows Germany having only an 11-12% capacity factor for solar? I assumed it was closer to Spain's figure
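
      A quick worked version of the heuristic as a small Python sketch (the 20% capacity factor and five-hour storage target are the article's figures; the Germany variant uses the 11-12% from the note above):

      ```python
      # Rule-of-thumb check: kWh of battery per kWp of solar, per the article's heuristic.
      capacity_factor = 0.20   # average solar output as a share of rated peak (worldwide)
      storage_hours = 5        # target hours of storage in a mostly-renewable grid

      print(capacity_factor * storage_hours)  # 1.0 -> 1 kWh of battery per 1 kWp of solar

      # With Germany's lower ~11-12% capacity factor, the same logic gives roughly 0.55-0.6 kWh per kWp.
      germany_capacity_factor = 0.115
      print(round(germany_capacity_factor * storage_hours, 2))  # ~0.58
      ```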

    2. Assuming we get all that from renewable energy this would mean an average production of 10 TW (=terawatt). That’s roughly 1 kW per person by the way.

      the big assumption here is getting it all from renewables, surely?
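
      A quick sanity check of the "roughly 1 kW per person" aside (the ~8 billion world population is my assumption, not the article's):

      ```python
      average_production_w = 10e12   # 10 TW of average production, from the article
      world_population = 8e9         # assumed ~8 billion people

      print(average_production_w / world_population)  # 1250.0 W, i.e. roughly 1.25 kW per person
      ```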

    1. ECL launched its first modular hydrogen data center back in May, with a small deployment at its site in Mountain View, California. Each module supports 1MW, and can cool up to 75kW per rack.For the TerraSite, ECL said that it has three pipelines of hydrogen that will feed the facility, with an energy cost of 0.08-0.12/kWh.

      Presumably these are the costs of non-green hydrogen, as it's significantly more expensive.

      More context here: https://rtl.chrisadams.me.uk/2024/01/hydrogen-datacentres-is-this-legit/

    1. As the primus inter pares of the Mexican bourgeoisie, Slim has received preferential treatment from the government, which he has reciprocated through constant public appearances with the President. When it became clear that the collapse in 2020 of an elevated metro line that had killed twenty-six people was due to poor workmanship by Mr. Slim’s engineering firm, then-mayor of Mexico City Claudia Sheinbaum reached an agreement with the billionaire: the firm would rebuild the line at a cost of about $40 million dollars. The victims received a compensation of $20,000 to $290,000, depending on whether they had been wounded or killed. For less than $50 million dollars, Slim bought his innocence. 

      😬

    2. The movement’s new leader, Claudia Sheinbaum won the Presidency with 60 percent of the votes in early June. With a two-thirds majority in both houses of Congress, the Movement for National Regeneration (Morena) will have the power to completely rewrite the country’s constitutional compact. 

      Wow, I had no idea the majority was so huge

    1. To run a command in the project environment, use uv run. Alternatively the project environment can be activated as normal for a virtual environment.

      oh my, so much better for non-Python people - they don't need to care about virtual envs
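
      A minimal sketch of what that looks like in practice, assuming uv's support for inline script metadata (PEP 723); the file name and dependency are illustrative:

      ```python
      # hello.py - run with `uv run hello.py`; uv resolves the inline dependencies and
      # creates the environment itself, so there is no manual venv/pip step.
      # /// script
      # dependencies = ["requests"]
      # ///
      import requests

      print("requests", requests.__version__)
      ```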

    1. Significantly, that represents 7.3 TWh of annual clean electricity output that currently does not exist on the US powergrid. Compare that to the annual output of the nation’s largest utility-scale solar plant, Solar Star in California, that generates about 1.6 TWh per year from 747 MW of capacity.

      This is different to the recent AWS nuclear deal
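
      A quick implied-capacity-factor check on the Solar Star comparison (only the 1.6 TWh and 747 MW figures come from the article; the rest is arithmetic):

      ```python
      annual_output_gwh = 1600      # ~1.6 TWh per year from Solar Star
      capacity_gw = 0.747           # 747 MW of capacity
      hours_per_year = 8760

      capacity_factor = annual_output_gwh / (capacity_gw * hours_per_year)
      print(round(capacity_factor, 3))  # ~0.245, i.e. a ~24.5% capacity factor

      # Rough solar capacity needed for 7.3 TWh/year at the same capacity factor:
      print(round(7300 / (capacity_factor * hours_per_year), 2))  # ~3.41 GW
      ```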

    1. Encouraging more research into sustainable manufacturing and implementing minimum requirements, such as using a certain percentage of renewable energy or investing in eco-friendly chemicals, could be initial steps. This approach mirrors the efforts observed in the implementation of the US CHIPS Act in the United States.

      Does the CHIPS Act have similar adders, and other strings attached to the funding?

    2. When it comes to measuring the ecological impact of chips during their operation in the end-product, it is difficult to come up with reliable and detailed data. One key challenge is that, according to the GHG Protocol guidance for scope 3, semiconductor manufacturers (both in front- and back-end manufacturing) do not need to report on their climate footprint, particularly with regard to emissions, as their products are classified as ‘intermediate products’ and are not directly sold as end-products

      Intermediate products don't show up in scope 3? That likely helps explain why there are so few figures for making chips at all

    3. An average hyperscale data centre consumes 11 to 19 million litres of water every day, only half the consumption of front-end manufacturing.

      Fabs are even thirstier than similarly sized DCs

    4. Recently, numerous reports 131 have emphasised the significant electricity consumption in front-end manufacturing by drawing comparisons with homes, 132 cities 133 134 and even national energy usage levels. 135 According to our calculations, the electricity consumption of the European semiconductor industry in 2030 will be around 47.4 TWh (a steep increase from 10 TWh in 2021), which is around half that of European data centres (98.5 TWh). 136 Electricity is the biggest single source of GHG emissions

      Decarbonising electricity still helps here.

      This jump in electricity use in manufacturing appears to be just scaling the number up by 4 because you have 4x as much manufacturing. Are there any economies of scale?

    5. For others, such as hydrogen (H), the picture looks different. The production of H is based on cryogenic processing, which means that the temperature of the gas stream needs to drop to approximately −85°C. 76 This consumes considerable energy and thus emits high indirect GHG emissions in production

      Could you not use electrolysis? Ah, the key thing here is the energy to cool it down

    6. Fluorinated gases have a much higher GWP than CO2, and thus a high climate footprint, and account for 80%–90% of direct emissions in a fab. 41 They are reported under scope 1 of the GHG Protocol and are regulated in the European F-gas regulation that was updated in January 2024 and adopted on 7 February 2024. 42

      Fugitive emissions of even 1% are still something that can have a significant climate impact because these are such utter bastards, GWP-wise

    7. The ecological impact of chemicals stems from three major sources
      1. creating the chemicals to use
      2. fugitive gases
      3. PFAS that do not break down
    8. Fabs in regions with high water scarcity, such as Taiwan, emphasise water reuse and recycling, achieving rates of around 80%, while those in Europe typically have lower recycling rates (10%–14%).

      Taiwan is much more efficient with water because it is in an area with high water scarcity. I did not know that, and assumed it was quite water rich.

    9. If the goal of a 20% production share stated in the EU Chips Act is met, it is expected that the electricity consumption of the European semiconductor industry will be around 47.4 TWh in 2030, half that of European data centres (98.5 TWh).

      This is a wild stat.

    10. But this position is increasingly untenable. If the EU Chips Act achieves its goal of 20% global production by 2030, emissions could increase up to eightfold, surpassing those of other emission-heavy industries. The climate issue will therefore become ever more pressing.

      Will this follow the same route as the US with NEPA being sidelined?

    1. This week I spoke with Matilda Krieder, a researcher at the National Renewable Energy Laboratory, about a database she and her colleagues released this week showing how onshore and offshore wind developers use community benefit agreements

      A database of community benefit agreements for wind projects. Does such a thing exist for datacentres?

    1. The cost of AI use at any normal human scale doesn’t waste a problematic amount of energy. The problem is the astronomical cost of creating AI systems massive enough to ensure corporate dominance

      This is a fantastic quote

  2. Sep 2024
    1. To GHG Protocol’s credit, the organization does ask companies to report location-based figures alongside their Rec-based figures. Despite that, no company includes both location-based and market-based metrics for all three subcategories of emissions in the bodies of their annual environmental reports.

      I don't think there is any concept of location-based and market-based emissions in the other scopes right now - it's only a scope 2 thing.
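
      A quick sketch of the scope 2 distinction in question, with illustrative emission factors (the numbers and the zero-rated REC treatment are simplifying assumptions, not figures from the article):

      ```python
      consumption_mwh = 1_000

      # Location-based: average emission factor of the grid where the electricity was consumed.
      grid_average_tco2_per_mwh = 0.35            # assumed regional grid average
      location_based = consumption_mwh * grid_average_tco2_per_mwh

      # Market-based: contractual instruments (RECs/PPAs) cover the matched volume, commonly
      # treated as zero-emission; any unmatched volume gets a residual mix factor.
      rec_matched_mwh = 1_000
      residual_mix_tco2_per_mwh = 0.45            # assumed residual mix factor
      market_based = (consumption_mwh - rec_matched_mwh) * residual_mix_tco2_per_mwh

      print(location_based, market_based)  # 350.0 vs 0.0 tCO2 for the same consumption
      ```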

    1. About a year ago, the Wall Street Journal reported that Microsoft's $10-a-month Github Copilot (which generates code and suggests changes  to the software you're building) loses the company on average $20-a-user-a-month, and in some cases costs Microsoft as much as $80-a-user-a-month. While it's possible that Microsoft could have found ways to make Github Copilot more efficient, this seriously suggests that Microsoft 365 Copilot loses money in much the same way, though generating code is a little more compute-intensive.

      Really? REALLY really?!?

    1. Residents rarely learn how data centers may affect their lives until it’s too late. Big tech operators are aggressively deploying nondisclosure agreements to force local officials, construction workers and others to keep these projects under wraps. For tech firms, the incentives to build more of these centers are immense: A McKinsey analysis projected the generative A.I. business could eventually be worth nearly $8 trillion worldwide. Tech companies don’t want to tip off rivals that might try to swoop in and steal a viable site for development. That’s part of the reason they so zealously enforce nondisclosure agreements. But it’s more than that — they also seem to want to avoid angering locals who might learn of the coming disruptions and protest zoning changes.

      This is an interesting angle I hadn't seen before. Because space is at a premium, infra providers are trying to hide their identity to stop competitors swooping in. That AND the whole social licence to operate thing.

    1. In August, Germany’s Economy Ministry presented an “options paper” on the future of the country’s electricity market, proposing a mix of liberal and centralised elements of electricity markets, as is the case elsewhere in Europe. Among its proposals, the paper suggests creating a new “capacity market” that would pay generators a flat fee based on size, unlike the current liberal market that rewards them for the energy they sell.

      So they're not incentivised to burn gas, but we have the power available. What about storage serving the same goal?

    1. Importantly, receiving nations were required to deposit money equivalent to those products in their central banks. This rebuilt financial health in Europe and order forms for US companies for a generation.

      So basically the money was given to the nations, who then had to put it in their central banks, and then pay it out to US firms?

    1. Since then, thousands of people across the country, including military service members, firefighters, and citizens have launched lawsuits against PFAS manufacturers. 3M paid $10.3 billion to settle lawsuits over contaminated drinking water; DuPont paid more than $1.2 billion; Tyco Fire Products, the makers of firefighting foam, settled for $750 million. Altogether, the chemical industry has paid $11 billion to cover the costs of cleaning up ground and water contamination. But so far, no individuals—including the Cotters—have been compensated for the potential effects of exposure to carcinogenic PFAS chemicals. 

      Wow, I had no idea there were so many out-of-court settlements paid around PFAS

    1. Nonetheless, LPG remains an expensive and imported fossil fuel. Indonesia is now drawing up plans for a large-scale deployment of electrified rice cookers and induction stoves.

      WAAAAAAT

  3. courses.cs.duke.edu
    1. In short, rough consensus was an apt description of this informal process in which a proposal must answer to criticisms, but need not be held up if supported by a vast majority of the group

      Ah... rough consensus is not the same as total consensus

    2. The IAB increasingly served as the steward of TCP/IP and the Internet, but had no legal mandate or enforcement mechanisms. In other words, IAB-backed protocols were de facto standards, whose status as standards depended on broad consensus and widespread implementation.

      So there was no enforcement power - just broad consensus

    1. For every euro spent on clean power in Europe over the 2022-40 period, 90 cents of grid investment will be required to achieve the EU’s climate ambitions. 

      woooooow!

    1. The GLEIF also produces an official mapping file, linking LEI records to the corresponding OpenCorporates records for the same legal entities (and this provides a route from users of OpenCorporates records to BIC and ISIN identifiers, which are also mapped to the LEI). This relationship is underpinned by both organisations having a data model where one-legal-entity is represented by one and only one record; where both organisations have public benefit missions; and where both are committed to non-proprietary identifiers.

      Oh neat - so this presumably means if you have an LEI, then you can also find the corresponding OpenCorporates ID, which would let you find the corporate grouping if it exists.
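
      A small sketch of how that lookup could work, assuming the mapping file has been downloaded as a CSV with LEI and OpenCorporatesID columns (the file name and column names are my assumptions, not GLEIF's published schema):

      ```python
      import csv

      # Build a lookup from LEI to OpenCorporates ID using the (assumed) mapping file.
      lei_to_opencorporates = {}
      with open("lei_opencorporates_mapping.csv", newline="") as f:
          for row in csv.DictReader(f):
              lei_to_opencorporates[row["LEI"]] = row["OpenCorporatesID"]

      lei = "5299000EXAMPLE00LEI0"  # placeholder 20-character LEI, not a real identifier
      print(lei_to_opencorporates.get(lei, "no mapping found"))
      ```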

    1. Instituting nondiscrimination or common carrier obligations on compute providers operating key points of the stack

      This is a bit like Net Neutrality, but for compute

    2. The FTC has already outlined this principle in its recent Amazon Alexa case

      Reference this, it’s an interesting precedent

    3. The UK’s Competition and Markets Authority recently published its initial report on AI Foundation Models, which will be followed up with a subsequent workstream in early 2024

      Is it out yet?

    4. Across the Atlantic, compute power has been an important element in France’s national interest in building out its AI capabilities. Among other moves, France funded the creation of the Jean Zay supercomputer in 2019, operated by the Centre national de la recherche scientifique (CNRS). The computer was used to train BigScience’s BLOOM large-scale AI model

      Ah, so THAT's where BLOOM was created. Would this have played a role in the transparency too?

    5. Over 50 new semiconductor projects were announced worth $200 billion following the passage of the Act, according to the Semiconductor Industry Association.162 Among them is TSMC, which plans to make a $40 billion investment in a new facility in Phoenix, Arizona.163 This is particularly notable because it illustrates that market subsidies can function to exacerbate rather than ameliorate market concentration if not carefully calibrated: given the existing bottlenecks in chip fabrication, such investments can easily be captured by dominant players even if they introduce more geographical spread.164 Notably, the chips produced within TSMC’s new facility will still be sent back to Taiwan for packaging and assembly, subverting the hope of creating fully made-in-the-USA chips.165

      Do the European deals have similar issues with maintaining the same bottlenecks?

    6. A 25 percent investment tax credit for semiconductor manufacturing and related equipment

      There’s an ITC for manufacture now

    7. There are a few other pathways toward overcoming the compute barrier, significant among which is a degree of decentralized training. Federated learning could possibly be a way to achieve scale without centralization

      Ask Philipp about flower and how they get around the interconnect issue

    8. For instance, Hugging Face has partnered with AWS in a revenue-sharing agreement to allow developers using Hugging Face to use AWS’s compute and software

      Ahhh, I hadn’t realised the kickback here. It’s much clearer to me how they make cash now

    9. Of particular note is that although TSMC is building a chip fabrication facility in Arizona under the US CHIPS Act, all of the chips fabricated at this facility will need to be sent to Taiwan for packaging, meaning that TSMC’s US chip production process will still require a global supply chain network

      Ah… this is what packaging was referring to before

    10. But this software dominance is also slowly being challenged. OpenAI developed Triton, an open-source software solution that it claims is more efficient than CUDA. Triton can only be used on Nvidia’s GPUs as of now.97 Meta developed PyTorch and then spun off the project as an open-source initiative housed under the Linux Foundation (still financially supported by Meta), and its new version performs relatively well on Nvidia’s A100.98 The benefit of PyTorch is that it can be used across a range of hardware, but on the flip side, it is not optimized for any particular chip

      Ah… so THAT's what purpose PyTorch serves. PyTorch is to CUDA what OCP is to proprietary hyperscale server design
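
      A minimal sketch of the hardware-agnostic point: the same PyTorch code can target whichever backend is available (CUDA, Apple's MPS, or plain CPU); the tiny model here is purely illustrative:

      ```python
      import torch

      # Pick whichever accelerator backend is available; the rest of the code is unchanged.
      if torch.cuda.is_available():
          device = torch.device("cuda")      # Nvidia GPUs, via CUDA
      elif torch.backends.mps.is_available():
          device = torch.device("mps")       # Apple silicon
      else:
          device = torch.device("cpu")

      model = torch.nn.Linear(128, 10).to(device)
      x = torch.randn(32, 128, device=device)
      print(model(x).shape)  # torch.Size([32, 10]) regardless of the backend
      ```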

    11. Cerebras differentiates itself by creating a large wafer with logic, memory, and interconnect all on-chip. This leads to a bandwidth that is 10,000 times more than the A100. However, this system costs $2–3 million as compared to $10,000 for the A100, and is only available in a set of 15. Having said that, it is likely that Cerebras is cost efficient for makers of large-scale AI models

      Does this help get around the need for interconnect enough to avoid needing such large hyper scale buildings?

    12. Moreover, its proprietary CUDA compiling software is the most well known to AI developers, which further encourages the use of Nvidia hardware as other chips require either more extensive programming or more specialized knowledge.

      It’s good that this is so explicitly called out as a bottleneck

    13. While FLOP/s grew by more than 6 times between Nvidia’s A100 chips and its latest H100 chips, memory bandwidth (interconnect) only grew by 1.65 times.82 Apart from practical and technological constraints, there is an energy cost to memory bandwidth, with a significant portion of a chip’s energy usage being attributed to interconnect. Overall, this means that interconnect is an important current constraint to the growth of computational power

      This is nicely demonstrated by the recent off-package / on-package power draw differential shown at Hot Chips 2024

    14. An important technological constraint with current memory technology is that while logic has only one goal to optimize for (maximizing the number of transistors on a chip), memory is trying to optimize for multiple goals (capacity, bandwidth, and latency).75 Latency has usually lagged behind the other two

      This is a really nice insight, presented succinctly

    15. This translates directly into cost increases: SOTA AI chips are 10–1,000 times more cost-effective than SOTA CPUs, and 33 times more cost-effective than trailing node AI chips. Thus a large AI model built on trailing node AI chips would be at least 33 times more expensive than models using leading node AI chips.61

      This is a really good example of demonstrating why people invest in new hardware

    16. in the 1990s and 2000s, the US government reduced its level of investment as interventionist policy fell out of vogue. This set the stage for Intel’s decline relative to firms like Taiwan Semiconductor Manufacturing Company, now the world’s dominant chip fabricator; and ASML, a Dutch company that is the sole manufacturer of the equipment needed to build state-of-the-art chips.

      I hadn’t realised the significance of the loss of subsidy in offshoring

    17. This trend has borne out historically: before the deep learning era, the amount of compute used by AI models doubled in about 21.3 months; since deep learning as a paradigm took hold around 2010, the amount of compute used by models started doubling in only 5.7 months12. Since 2015 however, trends in compute growth have split into two: the amount of compute used in large-scale models has been doubling in roughly 9.9 months, while the amount of compute used in regular-scale models has been doubling in only about 5.7 months

      If something is doubling faster in small models, how long before they overtake the larger models? I can't do the maths in my head
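
      A rough answer to that question as a Python sketch; the article only gives the doubling times, so the starting compute gap between large-scale and regular-scale models is an assumed input and the output is illustrative only:

      ```python
      import math

      # Doubling times from the article, in months.
      t_large = 9.9     # large-scale models
      t_regular = 5.7   # regular-scale models

      def months_to_catch_up(compute_gap: float) -> float:
          """Months until regular-scale compute matches large-scale compute,
          if large-scale models currently use `compute_gap` times more compute."""
          # Solve 2**(t / t_regular) = compute_gap * 2**(t / t_large) for t.
          return math.log2(compute_gap) / (1 / t_regular - 1 / t_large)

      # Example: with an assumed 100x gap today, the curves cross after ~89 months (about 7.4 years).
      print(round(months_to_catch_up(100), 1))
      ```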

  4. Aug 2024
    1. Third, relatedly, we need to excite people with big ideas that are congruent with the crisis, and that simultaneously speak to people’s deep economic and employment anxieties and the cost of living crisis. We need billions of dollars more spent on transformative climate infrastructure that will employ tens of thousands of people. Rather than trying to incentivize heat pumps with inadequate rebates, let’s just make them free! (As PEI does for households with incomes under $100,000.) Let’s talk about free public transit, and huge subsidies for e-bikes, to liberate people from punishing transportation expenses. And let’s propose paying for a chunk of all that with wealth and windfall profits taxes (a recent Abacus survey found increasing taxes on the richest 1% to be a massive vote-winner), and suing the corporations that got us into this mess (as California is doing). These represent transformative policies that tackle multiple crises at once and bolster solidarity.

      Wow. Bold. I love the sound of this, and yet I’m reflexively gearing myself up for the “how do we pay for it” response

  5. powering-the-planet.ghost.io
    1. Who will pick up the $280 billion bill?  So far, it is the U.S. public. As I argued last week, we already effectively own these liabilities. So, how do we get the money to pay for them? 

      Wow, this is the cost of cleanup? how does this change the cost of a well if you have to pay a reasonable cost for the clean up at the end of the life of the well?

    2. “It also involves cleaning up more than 100 years of industrial activity and if you look at all these producing fields in the Weald and in the east midlands, it’s a bit like the North Sea.

      Wait. The UK used to produce oil?

    1. Utilization effectiveness

      This is a new term to me. I think it's a bit like PUE in that you want the number to be as close to 1 as possible (i.e. for 1 server's worth of utilisation, you want to have provisioned 1 server, not 5)

      So, I think this is the term you might use to talk about how a hyperscaler datacentre might be full of individually efficient servers, but only have 5% of them serving workloads - the rest is just spare capacity ready to sell, or give away as credits
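
      A minimal sketch of that PUE-like reading of the term (the definition in the source may differ; the ratio and the numbers here are assumptions for illustration):

      ```python
      # PUE-style reading: provisioned capacity divided by capacity doing useful work.
      # A value close to 1 means little idle, speculative, or spare capacity.
      def utilization_effectiveness(provisioned_servers: int, utilized_servers: int) -> float:
          return provisioned_servers / utilized_servers

      print(utilization_effectiveness(1, 1))    # 1.0  - ideal: every provisioned server is used
      print(utilization_effectiveness(100, 5))  # 20.0 - the "only 5% serving workloads" case above
      ```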

    1. Aurora Serverless packs a number of database instances onto a single physical machine (each isolated inside its own virtual machine using AWS’s Nitro Hypervisor). As these databases shrink and grow, resources like CPU and memory are reclaimed from shrinking workloads, pooled in the hypervisor, and given to growing workloads

      Oh, wow, so the workloads themselves are dynamically scaling up and down "vertically" as opposed to "horizontally" - I think this is a bit like dynamically changing the size of Docker containers that are running the databases while they're running

    1. We weren’t satisfied that only a relatively small number of volumes and customers had better performance. We wanted to bring the benefits of SSDs to everyone. This is an area where scale makes things difficult. We had a large fleet of thousands of storage servers running millions of non-provisioned IOPS customer volumes. Some of those same volumes still exist today. It would be an expensive proposition to throw away all of that hardware and replace it

      They upgraded all of the servers rather than replacing them. That must’ve been a lot of work

    2. We didn’t have to worry much about the network getting in the way since end-to-end EBS latency was dominated by HDDs and measured in the 10s of milliseconds. Even our early data center networks were beefy enough to handle our user’s latency and throughput expectations. The addition of 10s of microseconds on the network was a small fraction of overall latency.

      Disks were so slow that the network didn’t really matter, basically

    3. These platters have tracks that contain the data. Relative to the size of a track (<100 nanometers), there’s a large arm that swings back and forth to find the right track to read or write your data

      Oh, so you're reading from a track like a record. It's much more like a record player than I thought

    1. Clean Energy Marshall Plan has the makings of a compelling pitch to U.S. domestic audiences: investing in the clean energy transition abroad will benefit businesses and workers at home. Evidence of that effect is already easy to find. The clean investment boom is turning novel technologies into market mainstays: emerging technologies such as hydrogen power and carbon capture now each receive more investment than wind

      This feels like an unfortunate example. Wind is one of the foundational technologies we would need for hydrogen to work, and CCS is still a boondoggle

    2. The Green Climate Fund, the sole multilateral public financial institution devoted to addressing climate change, could follow this approach, too. Almost 15 years after it was founded, the GCF has disbursed only 20 percent of the funding it has received

      Mind blown. Why?

    3. To complement the Clean Energy Finance Authority, the tariff could be lowered in exchange for foreign procurement of clean energy technologies or of clean products made in the United States

      So the CBAM as the stick, and the CEFA as the carrot

    4. A carbon-based tariff, or a carbon border adjustment, should further motivate climate action by exempting countries that are hitting their nationally determined goals under the 2016 Paris climate agreement or those that fall below certain income and emission thresholds

      Ah, they’ve explicitly said it here then.

    5. To accomplish this, the United States must use expanded, stronger, and smarter trade authorities. For example, Washington should build into its tariffs on imported goods an assessment of how much carbon was used to produce them. Tariffs should be determined by the emission intensity of the trading partner’s entire industry, rather than company by company, to avoid “resource reshuffling,” whereby countries try to dodge penalties by limiting their exports to only products manufactured with clean energy instead of reducing their emissions overall. These tariffs should be aimed at all countries, but given its current production practices, China would be hit the hardest.

      So a CBAM?

    6. the term “rare-earth minerals” is a misnomer: these elements are abundant and geographically dispersed. Eighty percent of the world’s lithium reserves, 66 percent of its nickel reserves, and 50 percent of its copper reserves are in democracies. Eighty percent of oil reserves, by contrast, are in OPEC countries, nearly all of which are autocracies

      Useful stat

    1. Companies using these services cannot learn by using these digital technologies because they pay only for use, not for access to the intangibles on the cloud.

      This really lays out why some models have the clause for not training your own model. it's to avoid the creation of 'property' that a customer no longer needs to rent

    1. But the climate impact of data centres could be significantly worse than this. Because of the huge strain data centres are placing on power grids, EirGrid placed a de facto moratorium on new connections around Dublin, causing many to seek a connection to the natural gas network to generate electricity on-site.

      Wow, it's in Ireland too?!

    1. The taxonomy is where the rules and data definitions are organised. It is comprised of a set of elements (i.e., Key Performance Indicators and narratives) and all the presentation, calculation and standard logic rules that are in effect. Once created, the XBRL taxonomy is made public as an open source file on the internet. Then, for a specific firm, software can be used to create an XBRL instance (the report itself), containing the specific facts and figures for a certain period. The XBRL instance can be checked against the taxonomy by all parties (reporting entity, a regulator, or even the public) in order to guarantee its data quality and reliability, as the taxonomy contains data quality checks that any XBRL engine can validate

      This is actually a handy description
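
      A heavily simplified sketch of the taxonomy / instance / validation relationship described above, using plain Python dicts rather than real XBRL tooling (the element names and the single rule are invented for illustration):

      ```python
      # Taxonomy: the shared definitions plus data-quality rules (drastically simplified).
      taxonomy = {
          "elements": {"EnergyConsumptionTotal", "EnergyConsumptionRenewable"},
          "rules": [
              # Renewable energy reported cannot exceed total energy reported.
              lambda facts: facts["EnergyConsumptionRenewable"] <= facts["EnergyConsumptionTotal"],
          ],
      }

      # Instance: one firm's facts and figures for a reporting period (values in MWh).
      instance = {"EnergyConsumptionTotal": 120.0, "EnergyConsumptionRenewable": 45.0}

      # Validation: anyone holding the public taxonomy can check the instance against it.
      known_elements = set(instance) <= taxonomy["elements"]
      rules_pass = all(rule(instance) for rule in taxonomy["rules"])
      print(known_elements and rules_pass)  # True -> the report is consistent with the taxonomy
      ```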

    1. The disclosure required by paragraph 35 shall include the total energy consumption in MWh related to own operations disaggregated by: (a) total energy consumption from fossil sources (40); (b) total energy consumption from nuclear sources; (c) total energy consumption from renewable sources disaggregated by:

      This is the disclosure requirement. It IS subject to materiality, so firms only need to report it if the figure is deemed material to their operations.

  6. Jul 2024
    1. DPI can be seen as part of a broader effort to reinvent our relationship to the internet—and, more generally, our digital ecosystem. A large part of its normative appeal stems from the “P” in the acronym: the sense that core functionality on the internet (i.e., identity, payments, data exchange) should not merely serve private ends but rather be reimagined as a set of public goods

      Who needs Venmo when you have this?

    1. This is a good policy, as unbundled REC purchases have a bad reputation when they are used across regions and countries as a cheap substitute for actual investments. But here they are being purchased to match investments in new generating capacity.

      This is an interesting framing. I haven't seen it presented like this before. Is there a way to "re-bundle" them?

    1. To use an extreme and blunt example, if an AI were tasked to stop global warming it might suggest to simply remove all the humans; that might get the job done (solve the task) but not in a way that is aligned with the intent (solve climate change while preserving human life).

      Summarising the alignment problem

    1. In this project, the data workers serve as community researchers. Every community researcher works for two to four months on their inquiry and is compensated for all their working hours. We collaborate with data workers globally. A decisive sampling criterion is that these are data workers who are already organized in workers councils, unions, communities, or advocacy organizations.

      Could this be applied to sustainability working groups, Green Teams, and related ERGs?

    1. On the matter of regulation and legislation, the UK needs to adopt similar legislation to that already in place in the EU namely, the Taxonomy Climate Delegated Act, including the Assessment Framework, the Energy Efficiency Directive and its associated data centre delegated act to collect energy and other environmental data, this could be achieved by simply amending the existing Climate Change Agreement, extending the provisions to all data centres located in the UK and reducing the threshold for compliance reporting to 100kW.

      Interesting - 100kW is the value that would cover telecoms as well, I think

    1. Under the report’s net zero scenario, gas use would peak around the middle of this decade before halving by 2050, compared with 2022 levels. But the current trajectory suggests gas demand will continue to grow throughout the forecast, expanding by about a fifth by 2050. In the scenarios, demand for liquefied natural gas, which is cooled to be transported on ships, climbs by 40% and 30% above 2022 levels respectively.

      fall in oil, growth in gas

    2. BP has predicted that the world’s demand for oil will peak next year, bringing an end to rising global carbon emissions by the mid-2020s amid a surge in wind and solar power.

      Oil company says oil demand will peak next year

    1. Putting the various figures together shows that, far from the modest 29% year-on-year increase in the incomplete NBS data, there was a record 78% rise in solar generation in May 2024.

      This was compared to May 2023 - so a nearly 80% jump in a single year, for China (!)

    2. Clean energy generated a record-high 44% of China’s electricity in May 2024, pushing coal’s share down to a record low of 53%, despite continued growth in demand. The new analysis for Carbon Brief, based on official figures and other data that only became available last week, reveals the true scale of the drop in coal’s share of the mix. Coal lost seven percentage points compared with May 2023, when it accounted for 60% of generation in China.

      Next year coal will be below 50% in China if the pace is kept

    1. They present a significant policy opportunity supported by the country’s ongoing efforts to develop the green bond market and can catalyze promoting green and high-quality development, creating jobs, and delivering environmental benefits.The Greenpeace East Asia report analyzed over 8,000 projects covered by newly issued provincial and municipal government bonds in 2021 as a sample and found that one in five could have been issued as such Green and Sustainable Municipal Bonds.2

      so basically, assuming there is a market for green bonds in the private sector, there are loads of 'bondable' projects

    1. “For our customer base, there's a lot of folks who say ‘I don't actually need the newest B100 or B200,’” Erb says. “They don’t need to train the models in four days, they’re okay doing it in two weeks for a quarter of the cost. We actually still have Maxwell-generation GPUs [first released in 2014] that are running in production. That said, we are investing heavily in the next generation.”

      What would the energy costs of the two options be, compared like this?

    1. As solar is displacing traditional assets, such as gas power plants, we are observing a relatively sharp increase in the price of some of the power reserves, which happens in the opposite direction than the prices on the day-ahead markets.

      This implies a massive rise in value for being able to provide reserve dispatchable power, like from batteries

    1. Germany's coalition government is set to overhaul the way renewable energy is subsidised so that power producers would get one-off support for their investment costs instead of a guaranteed price for power they produce, a finance ministry document showed on Friday.

      So this would either:

      a) reduce the need to borrow so much cash from risk-averse bankers, or
      b) not really help with the price volatility / merchant risk thing that makes it hard to get finance

    1. Mark Boost, CEO of UK-based cloud company Civo is similarly disappointed in the outcome. Boost said the deal is "not good news" for the cloud industry, and added several important questions still need to be answered. "We need to know more about how the process of compensation will work," he said. "Will all cloud providers in Europe be compensated, or just CISPE members? Is this a process that will be arbitrated by Microsoft? Where are the regulators in this?" Boost added that the deal will benefit CISPE members only in the short term, but that the cloud industry and its customers will pay the price in the long-term.

      This is a bit like how AWS will share sustainability data under NDA - works for that provider, but not everyone else.

    1. However, EFRAG's move to publish a draft for consultation at the same time as the ESRS at the end of January 2024, which is aimed at SMEs not covered by the CSRD, came as a surprise: With the so-called Voluntary ESRS for Non-Listed Small- and Medium-Sized Enterprises the "VSME ESRS", EFRAG is addressing all those companies that are not covered by the CSRD and thus the aforementioned concretisations (i.e., ESRS and ESRS LSME), but which nevertheless wish to take similar measures.

      So basically this is some guidance on what companies who aren't covered by the CSRD ought to cover to demonstrate following the intent of the law

    1. To start, titanium ore is heated to 1,800 degrees Fahrenheit and reacted with chlorine gas and carbon-rich petroleum ​“coke.” This step yields a liquid chemical, titanium tetrachloride, and also produces carbon dioxide as a byproduct (similar to how blast furnaces for ironmaking release CO2).

      I didn't realise titanium relied on electrolysis like aluminium too

    1. There also exist well-known vulnerabilities for eBPF programs, which can allow attacks to break container isolation [13] and execute malicious code in the kernel [22]. Since Wattmeter is built on top of eBPF and accesses RAPL information, only privileged users should be allowed to access it.

      So, vulnerable to platypus attack?

    2. Figure 3 shows the effect of EFS on the CPU and energy shares between the two processes

      it shows them using about the same power instead of the visible difference on the charts

    3. Another framework for implementing custom scheduling policies without direct kernel modification is Meta’s sched_ext. sched_ext is a Linux kernel patch that proposes a new abstract scheduler class, which can be instantiated with scheduling programs in eBPF at runtime. There are ongoing discussions of upstreaming sched_ext into Linux. We have chosen to implement the scheduling policies proposed in this paper with ghOSt, but it is also possible to implement them with sched_ext

      Wow, it might actually be upstreamed into linux proper?

    1. He added that private companies, which will have a significant role in the transition, want to see policy certainty enhanced in the months ahead. The group is awaiting the finalisation of the Electricity Regulation Amendment Bill, which promises to open up the electricity market and put an end to Eskom’s longstanding monopoly, and the Integrated Resource Plan.

      So, a market with more private generation firms, I'm guessing

    1. consumes 19% less energy per event in high performance mode

      in high power mode, it's more efficient for the amount of power it uses?

    2. This shows that the impact of load shaping heavily depends on the power proportionality 53 of the underlying hardware, and that it is not a reasonable measure per se.

      Ah, so assumptions about physical hardware can't be blindly applied to cloud. Assuming the TEADS model is accurate

    3. Although solar forecasts are not very promising this time, it again permits to discharge to 30%, as the carbon intensity during the next day is expected to be especially low. Instead of drawing carbon-intensive grid energy at night, the demand is thereby shifted to the next morning where the batteries are charged to 60% using cleaner energy.

      In this case local generation is low, but the grid is relatively clean (maybe it's v windy, not sunny), so it's ok to run down the local store of greener energy in the battery

    4. Although the carbon-aware experiment uses 2.4 % more energy than the baseline (which is because not all power modes have the same energy-efficiency), its associated carbon emissions through grid power consumption are 35.9 % lower. In the following, we will briefly analyze the two experiments and to demonstrate how our integration enables research and development of carbon-aware applications.

      How much power draw compared to the battery does this set up have? 32,000 mAh would be how long at max power draw for a Pi?

    5. The control unit adaptively adjusts the battery's minimum state of charge and grid charge rate over time. In particular, in case of promising forecasts for solar power production or low carbon intensity, it is able to temporarily deplete the battery to 30%.

      So, PUTing to the /soc endpoint with target charge and a new C value
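
      A hypothetical sketch of that call; the /soc endpoint is taken from the note, but the host, payload field names, and units are my assumptions rather than the paper's actual API:

      ```python
      import requests

      # Hypothetical control-unit API: set a new minimum state of charge and grid charge rate (C).
      payload = {
          "min_soc": 0.30,  # temporarily allow depleting the battery to 30%
          "c_rate": 0.5,    # assumed grid charge rate parameter
      }
      response = requests.put("http://testbed.local/soc", json=payload, timeout=5)
      response.raise_for_status()
      print(response.status_code)  # 200 if the control unit accepted the new targets
      ```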

    6. This is especially the case when not testing virtualized applications on powerful hardware but embedded systems that often run only one energy-hungry process at a time. In these systems, load shaping is likely rather performed on a device level, for example, through DVFS.

      Also with exclusive use of a GPU server, right?

    7. Therefore, if applications under test are deployed on physical nodes like single-board computers, it is recommended to use dedicated hardware for measuring the power usage of these devices. For example, in our experimental testbed we monitor the current and voltage of a RaspberryPi 4b with a USB to USB measuring device equipped with an INA219 DC current sensor.

      What kit has an INA219 DC current sensor these days? How can I buy one?
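
      For what it's worth, INA219 breakout boards are cheap and widely available (Adafruit sells one), and reading one from a Pi is a few lines. This sketch assumes the adafruit-circuitpython-ina219 library and default I2C wiring, so treat it as illustrative:

      ```python
      import board
      import busio
      from adafruit_ina219 import INA219

      # INA219 breakout wired to the Pi's I2C pins (default address 0x40).
      i2c = busio.I2C(board.SCL, board.SDA)
      sensor = INA219(i2c)

      bus_voltage_v = sensor.bus_voltage           # volts on the load side
      current_ma = sensor.current                  # milliamps through the shunt resistor
      power_w = bus_voltage_v * current_ma / 1000  # derive power from the two readings

      print(f"{bus_voltage_v:.2f} V, {current_ma:.1f} mA, {power_w:.2f} W")
      ```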

    8. Note: The physical node's power usage is controlled via DVFS, while the virtual node uses rate limiting on the executed process.

      Ah, two strategies for "Change the speed" of the three approaches I list in this post

    1. “These claims assume that a company that pollutes more now should be able to pollute into the future. This means Global North companies will continue to inequitably dominate use of our remaining carbon budget. These findings should raise real questions for any bodies that claim to set standards for voluntary corporate climate targets,” David Tong, Global Industry Campaign Manager for Oil Change International, said.
    2. A major blindspot is the fact that SBTi does not take into account new companies with their share of emissions coming into existence in the future. In SBTi’s framework, existing companies are allocated a share of the carbon budget without leaving any room for new players, some of whom might be more efficient or even working in the decarbonisation space like solar technologies. This further entrenches fossil fuel developments by existing companies and also raises questions about equity.

      New firms could be better than incumbents as emissions are ring fenced for incumbents

    3. Net-zero corporate pledges are voluntary, which means they can be reeled back in as quickly as they are announced.Shell scrapped its emission reduction target for 2035 when it sought to grow its gas business, for example, and BP walked back on some of its climate commitments when profits hit a record high. An Oil Change International assessment of the climate plans of eight oil majors—Chevron, ExxonMobil, Shell, TotalEnergies, BP, Eni, Equinor, and ConocoPhillips—released this year found that all eight continue to drive fossil fuel expansion and six have explicit goals to grow their total production volume this decade
    4. Not all do so and there is no agreement on what a fair share is, but by claiming an individual target is Paris compliant is implicitly making an ethical claim that this represents your fair share of the global response, without saying how it is fair,” she said.
    5. The Climate Policy paper points to issues that arise when individual countries or companies link their efforts to mitigate climate change to Paris Agreement goals. If such linking is indeed necessary, the authors say that assumptions about time scale, spatial scale and equity must be included in the analysis and presented transparently.The paper is “a welcome contribution” because it asks countries to center mitigation claims in a context of national contributions to global fair shares of the climate response, said Kate Dooley, a research fellow in the School of Geography, Earth and Atmospheric Sciences at the University of Melbourne. At present, many wealthy countries have exceeded their fair share of the carbon budget.
    6. The distribution of emissions over time is not just a question of cumulative global emissions—it’s also a matter of equity. Rich, industrialized countries have already claimed a disproportionate share of the carbon budget that has brought us to around 1.2°C warming today. The U.S. has emitted about 25 percent of cumulative emissions; Europe, meanwhile, has emitted around 20 percent.

      Good point

  7. Jun 2024
    1. In fact, capturing CO2 from ethanol production is so easy that ethanol production is the biggest source of purified CO2 in the US, and the second biggest source in Europe. This is where we source the CO2 that we use in fizzy drinks, food production, fire extinguishers, etc.

      Interesting. I thought it was Steam Reformed Methane

    1. it did so by taking some short cuts. Android itself was an acquisition in 2005 (for $50 million, with a keyboard interface) and, in response to the iPhone, a new touch user interface was developed

      I totally forgot that android was an acquisition

    1. In their annual financial report, companies will have to disclose sustainability information in the management report. The details of the sustainability information to be published are set out in Commission Delegated Regulation (EU) 2023/2772 of 31 July 2023, to which the French Decree 2023-1394

      So, 2023-1394 is the thing to keep an eye on

    1. That’s no small task: In 2022, SAP spent approximately €7.2 billion in purchases from more than 13,000 suppliers worldwide, its annual report shows; 30% of that being on cloud services – more on that below.

      €7.2bn - over €2bn of that is spent on cloud, all by themselves?

    2. He is referring to a mechanism SAP first tested in 2016 across 10 countries. This prices CO2 (from €40 to €400 per ton; depending on the flight and its distance) to release money for sustainability initiatives, as SAP ramps up not just its own internal decarbonisation programme (it aims to reach net zero by 2030) but continues to build out a suite of sustainability products for its expansive customer base.

      SAP have been using internal carbon pricing for nearly ten years, and they are sophisticated enough to have different prices even within aviation

    1. As 45% of last year’s record solar additions were distributed generation, the exclusion of small solar installations is affecting these numbers a lot more than it used to.

      Wow, nearly half of China's solar additions are distributed? How is it funded?

    1. If there was one universal piece of advice I had for marketers seeking to broadly improve their organic search rankings and traffic, it would be: “Build a notable, popular, well-recognized brand in your space

      Recognisable brand, over content?

    2. A module on “Good Quality Travel Sites” would lead reasonable readers to conclude that a whitelist exists for Google in the travel sector (unclear if this is exclusively for Google’s “Travel” search tab, or web search more broadly). References in several places to flags for “isCovidLocalAuthority” and “isElectionAuthority” further suggests that Google is whitelisting particular domains that are appropriate to show for highly controversial or potentially problematic queries.

      We know they whitelist election sites. Would they do this with climate science?

  8. May 2024
    1. Our study suggests the transformation pathway will have three main phases.

      peaking (slowing down fossil deployment and ramping up clean gen), then energy revolution (bulk transition), then consolidation (clean up)

    2. Meanwhile, gas does not play a significant role in the power sector in our scenarios, as solar and wind can provide cheaper electricity while existing coal power plants

      This is a surprise. I had assumed China might shoot for gas as a stopgap too, but it looks like they're skipping it

    3. The upper panel in the figure shows the installed capacity of coal power plants and the lower panel their electricity production from 2021 to 2060.

      Wow, China has more than a terawatt of coal generation capacity in 2024

    1. What is happening in China with electric vehicles is pretty stunning. China is the world’s largest auto market by far — in 2022 China sold 26.8 million vehicles, the U.S. sold 13.8 million and Japan was third with 4.3 million.

      Holy shit.

      If EVs make up half the cars sold in China, then are more EVs being sold in China than cars and trucks in the USA?

    1. There is mounting evidence that the demand for imported cooking oil in the UK and Europe is being met with virgin palm oil that has been fraudulently passed off as waste. This would cancel out the fuel’s emissions savings, due to the land clearances for oil palm plantations.

      YIIIIIKES

    1. To produce the 12,750 MWh that will be generated by the data center at full capacity, 2660 t of pellets would have to be burned per year

      This suggests that SIG is buying the heat, which is what makes the whole thing economically workable

    2. Instead of wasting the 45°C hot air from equipment and servers in the atmosphere, the new data center will inject this flow into air-to-water heat pumps; these will raise the temperature from 45°C to 67-82°C in order to adapt to the current requirements of the district heating installations of Services Industriels de Genève (SIG).

      This suggests they are just complying with the law now? Or are they going beyond what is required?

    1. First, it will require the operators of regional grids across the country to forecast their region’s transmission needs a full 20 years into the future, develop plans that take those forecasts into account, and update those plans every five years. In practice, this should mean a more robust consideration of new wind and solar options, as well as greater adherence to the net-zero emissions targets set by many U.S. states.

      how far ahead did they need to forecast before?

    1. Over the past three years, battery storage capacity on the nation’s grids has grown tenfold, to 16,000 megawatts. This year, it is expected to nearly double again, with the biggest growth in Texas, California and Arizona.

      10x in 3 years?!

  9. Apr 2024
    1. The critical sixth key was the contest key: Bernie Sanders’s contest against Clinton. It was an open seat so you lost the incumbency key. The Democrats had done poorly in 2014 so you lost that key. There was no big domestic accomplishment following the Affordable Care Act in the previous term, and no big foreign policy splashy success following the killing of Bin Laden in the first term, so there were

      Bernie Sanders was the missing key for the Hillary loss then?

    1. Now, the United Nations has taken a first step toward filling in these data gaps with the latest installment of its periodic report on e-waste around the world. Released last month, the new Global E-Waste Monitor shows the staggering scale of the e-waste crisis, which reached a new record in 2022 when the world threw out 62 million metric tons of electronics. And for the first time, the report includes a detailed breakdown of the metals present in our electronic garbage, and how often they are being recycled.
    1. Following an attributional approach, and with the assumptions retained, the Iroco study evaluates the greenhouse gas emissions attributable to one week's use of a mailbox at 63.2 gCO2eq.

      So about 3.3 kg per year

    1. However, corporate governance in the telecoms sector is not conducive to the kind of radical decisions that separation requires. Decision makers and boards are incentivised on metrics that require short term continuity, not disruption

      So to make this argument in cloud/AI, you need to make the argument that firms are too conservative in terms of corporate structure, and as a result are missing out on greater value coming from two individually more valuable entities - something that the org or compensation structure is not set up for

    2. Rural broadband in Europe is increasingly deployed in a neutral host model with a single fibre network connecting homes and multiple service providers delivering services over that network.

      This is like the unbundled approach already then

    3. as pure infrastructure players borrow at lower rates and expect lower returns, they will deploy infrastructure where vertically integrated players won’t, and with no (or less) public subsidies;

      Infra with more patient capital isn't chasing such high risk returns like getting everyone to buy into AI

    4. We have one leading example of voluntary structural separation resulting in increased valuation of the separated entities - Telecom NZ, in 2011. Post de-merger, after years of flat market valuation, both Chorus (infrastructure) and Spark (services) saw share prices rise. Between 2015 and 2023, the combined market capitalisation of Chorus and Spark grew by 150%, whereas that of European and US vertically integrated telcos 2 grew only by 15%. The plan also delivered full FTTH coverage to 87% of New Zealand homes with 72% adoption. [Footnote 1: Tower companies (or towercos) are infrastructure companies that manage mobile towers for multiple mobile network operators.]

      So breaking them up led to greater share price rises AND better service in New Zealand

    1. So should we collectively accept a radical shift in our policy and regulatory framework in order to ensure that the beacon of the European economy can deploy infrastructure that Sweden, Denmark, Spain, France, and many of the smaller European countries started successfully deploying long ago? This, in a nutshell, is why the Commission never talks about Germany.

      This appears to be making the argument that rather than admit Germany has underinvestment compared to the rest of the EU, it's easier to change the entire EU telecom policy so vertically integrated telcos can make a better return.

    1. The article confirms many of my suspicions — that, as The Information wrote, "other software companies that have touted generative AI as a boon to enterprises are still waiting for revenue to emerge," citing the example of professional services firm KPMG buying 47,000 subscriptions to Microsoft's co-pilot AI "at a significant discount on Copilot's $30 per seat per month sticker price." Confusingly, KPMG bought these subscriptions despite not having gauged how much value its employees actually get out of the software, but rather to "be familiar with any AI-related questions its customers might have."

      This seems like spending ~$17 million a year (47,000 seats × $30/month × 12, at list price) to help them sell consulting about AI.

    1. It's important to understand these complexities because they are currently being flattened not just by corrupt government officials in Global South countries or fossil fuel executives in Texas, but also by a whole ecosystem of pundits like Jordan Peterson, Michael Shellenberger, Alex Epstein and even Joe Rogan, who use the idea that fossil fuel development will solve poverty in Africa as justification for continuing fossil fuel's dominance in the world. In fact, fossil fuel development hasn't even solved energy poverty in the African countries that have embraced it; Nigeria has the continent's largest and oldest fossil fuel industry and yet still has the world's lowest energy access rates.

      Wow, I had no idea that energy access was so low in places like this.

    1. To plug the financing hole Intel has had to rely on a wide variety of capital sources to fund everything: traditional debt financing, government support, and even more creative financial engineering schemes like the Brookfield fab deal to find a way to pay for everything.

      This is what, around 60bn of government subsidy?

  10. Mar 2024
    1. As noted by researchers at the University of Oxford and the Mercator Research Institute on Global Commons and Climate Change, carbon-intensive sectors like electricity generation actually contribute relatively little to the world economy, compared to high-value but lower-emitting sectors like IT, real estate, and social work. 

      what? the industries below RELY on electricity!

    1. The above report is an underestimation of datacenter power demand, but there are plenty of overestimates too

      Wow, so SemiAnalysis is seeing the IEA estimates as a lowball? I did not see that coming

  11. Feb 2024
    1. The FTC, of which Khan is now chair, contends that Amazon has found a way to push up prices after all, without losing shoppers. It does this, the FTC argues, by imposing ever-higher fees on third-party sellers, who have no choice but to pass these costs on to shoppers by raising their prices. If those businesses try to sell more cheaply elsewhere, Amazon throttles their sales on its platform—which, thanks to the central place in e-commerce that Amazon has built up over the years, can be a death sentence for these businesses. So, according to the FTC, sellers generally absorb the high fees by inflating their prices not just on Amazon but across the web—precluding the price competition that could loosen Amazon’s grip on e-commerce.

      This sounds like the "most favoured nation" clauses from Cory's book

    1. The disclosure required by paragraph 33 shall include the total energy consumption in MWh related to own operations as follows
    1. and superconductors will at some point do to conventional power cables what fibre-optics did to conventional telephone cables.

      ? I assume this is referring to stuff like graphene.

  12. Jan 2024
    1. Because we use systemd for most of our service management, these stdout/stderr streams are generally piped into systemd-journald which handles the local machine logs. With its RateLimitBurst and RateLimitInterval configurations, this gives us a simple knob to control the output of any given service on a machine. This has given our logging pipeline the colloquial name of the “journal pipeline”, however as we will see, our pipeline has expanded far beyond just journald logs.

      I did not expect to see journald being used as the basic building block
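      To make the RateLimitBurst / RateLimitInterval semantics concrete, here is a rough sketch of the same burst-per-interval idea as a Python logging filter. This is an illustration only, not how journald is implemented, and the class name and defaults below are made up.

      ```python
      # Rough re-implementation of the burst-per-interval idea behind journald's
      # rate limiting, as a stdlib logging filter. Illustrative only: journald
      # does not work this way internally, and these names/defaults are invented.
      import logging
      import time


      class BurstRateLimitFilter(logging.Filter):
          def __init__(self, burst: int = 1000, interval_sec: float = 30.0):
              super().__init__()
              self.burst = burst            # max records allowed per window
              self.interval = interval_sec  # window length in seconds
              self.window_start = time.monotonic()
              self.count = 0

          def filter(self, record: logging.LogRecord) -> bool:
              now = time.monotonic()
              if now - self.window_start > self.interval:
                  self.window_start = now   # new window: reset the counter
                  self.count = 0
              self.count += 1
              return self.count <= self.burst  # drop anything beyond the burst


      logging.basicConfig(level=logging.INFO)
      logger = logging.getLogger("service")
      logger.addFilter(BurstRateLimitFilter(burst=5, interval_sec=10))

      for i in range(20):
          logger.info("handled request %d", i)  # only the first 5 per window get through
      ```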

  13. Dec 2023
    1. Overall, Iceland produces approximately 20 terawatt hours of electricity per year.

      About the same as Google?

    2. Landsvirkjun built Kárahnjúkar despite protests, it started operating in 2009. But the controversy changed Iceland's energy policy. A fourth aluminium smelter was never finished, and in an announcement made in 2022, Landsvirkjun writes that it "will for the time being not focus on attracting new large-scale customers in the metal industry or other industrial commodities."

      I did not know there was pushback against new power projects even in Iceland

    1. Meta previously bet on CPUs and its own in-house chips to handle both traditional workloads and AI ones. But as AI usage boomed, CPUs have been unable to keep up, while Meta's chip effort initially fizzled.

      I did not know they tried to make their own chips

    1. It’s past time for the IT and BPO Services industry to jump to a new S-curve driven by technology arbitrage if they wish to get back to another season of hockey stick growth:

      Yikes. So basically, time to fire the people we outsourced to, and replace them with ChatGPT?

    1. wrote about this in 2012 in a book called Liars and Outliers. I wrote about four systems for enabling trust: our innate morals, concern about our reputations, the laws we live under, and security technologies that constrain our behavior. I wrote about how the first two are more informal than the last two. And how the last two scale better, and allow for larger and more complex societies. They enable cooperation amongst strangers.

      Morals and reputation

      Laws and tech

    1. They were national strategic programs. The strategic programs were aligned with nuclear weapons programs. The government picked and enforced a single design for all of the reactors. The reactors were GW-scale due to thermal efficiencies required for cost effectiveness. The government ran human resourcing. The programs ran for 20 or 30 years. They built dozens of nuclear reactors to maintain the teams and momentum and to share lessons learned.

      These are the seven conditions necessary for nuclear to work, based on the historical evidence

    2. A key point to remember about the US DOE is that 55% of its budget is related to commercial nuclear generation. The other 45% covers dams, geothermal, wind, solar, tidal, wave, biomass and biofuel energy.

      Ah! Numbers here! This is super helpful!

  14. Nov 2023
    1. A second MoU on gas export partnership was agreed between Riverside LNG of Nigeria and Germany's Johannes Schuetze Energy Import AG. Under the accord, Nigeria will supply 850,000 tons of natural gas to Germany annually which is expected to rise to 1.2 million. The first deliveries will be in 2026, Ngelale said.The deal will help process about 50 million cubic feet per day of natural gas that otherwise would have flared.

      OK, so instead of being flared for no money, it's being bought by Germany to burn instead.

    1. Volkswagen teamed up with ClimatePartner GmbH to generate offsets so it can be “independent from the market where you can’t control what’s effective and what’s not,” said Esra Aydin, a spokesperson for the automaker. But the credits didn’t materialize in time last year and it ended up going back to the voluntary market where cheap renewable-energy credits were the only ones that worked with its budget.

      "Worked with its budget"

    2. In other words, the companies decided some offsets weren’t good enough to meet their own climate goals — and yet have continued selling them to customers as a feel-good solution.

      That's quite a quote

    1. In contrast with the precipitous fallings-off of platforms like Myspace and LiveJournal, the decays of which have been clear and irreversible, Tumblr has kept defying expectations and stayed somewhat thriving against all odds. That is, in part, thanks to the guidance and guardianship of Automattic, which acquired Tumblr for the bargain-bin price of $3 million.

      Holy balls. Only three mil?

    1. The two corporations aren’t directly buying fuel from World Energy. Instead, they’ll purchase certificates representing SAF that gets pumped into the larger supply chain — then count the associated carbon reductions toward their sustainability goals. Microsoft agreed to buy certificates representing 43.7 million gallons of SAF over a 10-year period, while DHL signed a contract for 177 million gallons over seven years.

      These are basically like EACs for fuel!

    1. Provided that reliability and supply improve, the grid could become the optimal solution to provide almost 60% of people with access to electricity in each scenario.In the AC, Nigeria achieves universal access by stepping up efforts to provide off-grid solutions to those populations that live far from a grid.

      Wow, really that much off grid generation?

    1. However, some have criticised the CIPP for providing market-rate loans rather than special financing schemes, which will lead to high costs for Indonesia and could deter other countries from accepting similar deals in the future. 

      Ouch! What's the interest rate in Indonesia compared to North America for energy infra?

    1. Here’s another problem: from 2020 to 2022, shopify claims around 31,000 tonnes of carbon removal. But according to the CDR.FYI database, only 4,000 of the tonnes they have purchased have been actually physically removed from the atmosphere

      If you have increases in temporal resolution for electricity, why not for CDR?

    2. Shopify don’t count the emissions footprint of the products sold by merchants in their actual climate data. No shipping, no manufacturing emissions, nothing (Amazon play a similar trick).

      This is an interesting point - Shopify can argue they do it to avoid double counting, but that's not really what scope 3 is designed for

    1. It spent just over $5m to advertise its compact Bolt option over the same time period. Ford spent $61.2m (£48.7m) to advertise its electric F150 , and around $9m to advertise its mid-size electric Mustang. Among the top automotive advertisers, only BMW and Hyundai are spending as much or more to market their more efficient EVs.

      The F150 is its biggest selling automobile. I'd assume this would be higher though, surely?

    1. The CDP, previously known as the Carbon Disclosure Project and the most comprehensive global registry of corporate carbon emission commitments, recently said that of the 19,000 companies with registered plans on its platform, only 81 were credible.

      Sheesh, less than half of one percent of the plans shared with CDP were credible?

    1. Over a year that equates to roughly seven million in savings and is precisely why the biggest names in tech are moving to colder, more remote locations.

      How big does a DC need to be for this 7 million?

    2. Downtime costs roughly $10,000 per minute in a hyperscale and is categorised as the highest risk.

      This is a pretty wild quote. I wonder what the source is?

    1. CSP makes it possible for server administrators to reduce or eliminate the vectors by which XSS can occur by specifying the domains that the browser should consider to be valid sources of executable scripts. A CSP compatible browser will then only execute scripts loaded in source files received from those allowed domains, ignoring all other scripts (including inline scripts and event-handling HTML attributes).

      I don't think I'd come across this before, but I did use it on a recent project. I didn't know you could block inline styles or inline JavaScript in this way
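      As a rough illustration (not from the article): a minimal Flask sketch that sets a Content-Security-Policy header so only same-origin scripts and one named CDN can execute. The app and the https://cdn.example.com domain are made-up placeholders.

      ```python
      # Minimal sketch of serving a CSP header from a Python (Flask) app.
      # "https://cdn.example.com" is a hypothetical allowed script source.
      from flask import Flask

      app = Flask(__name__)


      @app.after_request
      def add_csp_header(response):
          # Only scripts from this origin and the named CDN may execute;
          # inline <script> tags and event-handler attributes are blocked
          # because 'unsafe-inline' is not listed.
          response.headers["Content-Security-Policy"] = (
              "default-src 'self'; "
              "script-src 'self' https://cdn.example.com; "
              "style-src 'self'"
          )
          return response


      @app.route("/")
      def index():
          return "<p>Hello, CSP</p>"


      if __name__ == "__main__":
          app.run()
      ```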

    1. The approximate cause of power problems often isn’t that hard to find. Fixing them is often the hard part. Good luck.

      Quotable

    2. Intel processors also support multiple P-states. P0 is the state where the processor is operating at maximum frequency and voltage, and higher-numbered P-states operate at a lower frequency and voltage to reduce power consumption. Processors can have dozens of P-states, but the transitions are controlled by the hardware and OS and so P-states are of less interest to application developers than C-states.

      These exist too, but we can only control them indirectly at best.

    3. Intel processors have aggressive power-saving features. The first is the ability to switch frequently (thousands of times per second) between active and idle states, and there are actually several different kinds of idle states. These different states are called C-states. C0 is the active/busy state, where instructions are being executed. The other states have higher numbers and reflect increasing deeper idle states. The deeper an idle state is, the less power it uses, but the longer it takes to wake up from.

      Mental note: Think "C for the CPU cool down"
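      Not from the article, but a quick way to poke at these states on a Linux machine: a small Python sketch that reads the cpuidle (C-state) and cpufreq (current frequency, which reflects P-state behaviour) entries from sysfs. The paths assume Linux with those drivers loaded; on other systems they simply won't exist.

      ```python
      # Peek at idle (C-) states and current frequency for CPU 0 via Linux sysfs.
      # Assumes /sys/devices/system/cpu/cpu0/cpuidle and .../cpufreq are present.
      from pathlib import Path

      CPU0 = Path("/sys/devices/system/cpu/cpu0")


      def read(path: Path) -> str:
          return path.read_text().strip()


      def main() -> None:
          cpuidle = CPU0 / "cpuidle"
          if cpuidle.is_dir():
              for state in sorted(cpuidle.glob("state*")):
                  name = read(state / "name")           # e.g. POLL, C1, C6...
                  time_us = int(read(state / "time"))   # total time in this state (µs)
                  usage = int(read(state / "usage"))    # number of times entered
                  print(f"{state.name}: {name:<8} entered {usage} times, "
                        f"{time_us / 1e6:.1f}s total")
          else:
              print("no cpuidle sysfs entries found")

          cur_freq = CPU0 / "cpufreq" / "scaling_cur_freq"
          if cur_freq.exists():
              print(f"current frequency: {int(read(cur_freq)) / 1000:.0f} MHz")


      if __name__ == "__main__":
          main()
      ```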

    1. Even before Hardin’s ‘The Tragedy of the Commons’ was published, however, the young political scientist Elinor Ostrom had proven him wrong. While Hardin speculated that the tragedy of the commons could be avoided only through total privatisation or total government control, Ostrom had witnessed groundwater users near her native Los Angeles hammer out a system for sharing their coveted resource. Over the next several decades, as a professor at Indiana University Bloomington, she studied collaborative management systems developed by cattle herders in Switzerland, forest dwellers in Japan, and irrigators in the Philippines. These communities had found ways of both preserving a shared resource – pasture, trees, water – and providing their members with a living. Some had been deftly avoiding the tragedy of the commons for centuries; Ostrom was simply one of the first scientists to pay close attention to their traditions, and analyse how and why they worked.
    2. The features of successful systems, Ostrom and her colleagues found, include clear boundaries (the ‘community’ doing the managing must be well-defined); reliable monitoring of the shared resource; a reasonable balance of costs and benefits for participants; a predictable process for the fast and fair resolution of conflicts; an escalating series of punishments for cheaters; and good relationships between the community and other layers of authority, from household heads to international institutions.
    3. Among his proposed solutions to the tragedy of the commons was coercive population control: ‘Freedom to breed is intolerable,’ he wrote in his 1968 essay, and should be countered with ‘mutual coercion, mutually agreed upon’. He feared not only runaway human population growth but the runaway growth of certain populations. What if, he asked in his essay, a religion, race or class ‘adopts overbreeding as a policy to secure its own aggrandisement’? Several years after the publication of ‘The Tragedy of the Commons’, he discouraged the provision of food aid to poorer countries: ‘The less provident and less able will multiply at the expense of the abler and more provident, bringing eventual ruin upon all who share in the commons,’ he predicted. He compared wealthy nations to lifeboats that couldn’t accept more passengers without sinking.

      YIKES

    1. Mobile network operators, often government sanctioned monopolies granted via spectrum licenses, are in increasing competition with an explosion of “open spectrum” technologies and new operating models for wireless networks.
    2. there are well understood risks of turning unchaperoned engineers loose on social problems
    3. there are well understood risks of turning unchaperoned engineers loose on social problems. And more so as the engineering tools become more specialized: if your only tool is an abstract algebraic curve, all the world becomes a cryptographic nail. 

      Crypto as the hammer that turns everything into a nail.

    4. This liminal space, often vaguely described as the “Internet of Things” (IoT) doesn’t hold much resemblance to the Internet, at least as originally conceived. And the “things” commonly encountered are toasters, washing machines, and toothbrushes, with an often inexplicable desire to “connect.”
    5. IoT’s current failures are also an opportunity. Beyond the absurdity of overly chatty microwaves, plugging the physical world directly into extractive (and often tenuous) business models has raised public awareness of the shortcomings of how we currently build digital infrastructure. Whether it’s a bike that stops working when its manufacturer goes bankrupt, or a home appliance with unclear allegiances, it has created real harms from exfiltration of personal data, and unnecessarily reducing the lifespan or increasing fragility of devices we depend on as part of daily life.

      Overly chatty microwaves, and home appliances with unclear allegiances - there are so many quotable bits in this post!

    6. a home appliance with unclear allegiances,

      This is a wonderful turn of phrase

    1. For an example of an unintended consequence, let’s say the result of your optimization project is spare capacity at a cloud provider. Then that capacity is then offered on the spot market, which lowers the spot price, and someone uses those instances for a CPU intensive workload like crypto or AI training, which they only run when the cost is very low. The end result is that capacity could end up using more power than when your were leaving it idle, and the net consequence is an increase in carbon emissions for that cloud region.

      This is because capacity, utilisation and power used are related, but different concepts.

      Your capacity, which you share back to the pool, is then available to someone else, who ends up buying it to use for a task that has a higher average utilisation, resulting in more power being used in absolute terms, even if less is attributed to you.

      This also raises the question - who is responsible: the customer for making the capacity available, or the cloud provider who accepts the workload?

      The cloud providers get to set the terms of service for using their platform.
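      A back-of-the-envelope sketch of the point above, using a simple linear server power model. The idle and max power figures are illustrative assumptions, not measurements of any real instance type.

      ```python
      # Toy comparison: capacity left idle by one customer vs. the same capacity
      # resold on the spot market and run hot by another. Power figures are
      # illustrative placeholders, not real instance measurements.

      IDLE_W = 100.0  # assumed draw of a server doing nothing
      MAX_W = 400.0   # assumed draw at 100% utilisation


      def power_watts(utilisation: float) -> float:
          """Simple linear power model between idle and max draw."""
          return IDLE_W + utilisation * (MAX_W - IDLE_W)


      servers = 50

      # Before "optimisation": servers sit mostly idle, attributed to you.
      before = servers * power_watts(0.05)

      # After: you release the capacity, someone buys it cheap on the spot
      # market and runs a crypto/AI training workload near flat-out.
      after = servers * power_watts(0.90)

      print(f"attributed to you before: {before / 1000:.1f} kW")
      print(f"drawn by the new workload: {after / 1000:.1f} kW")
      print(f"absolute increase: {(after - before) / 1000:.1f} kW")
      ```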

    1. King Kamehameha

      THAT'S where the Kamehameha comes from?

    2. Average retail electricity rates in Hawaiian Electric territory are higher today than they were a decade ago, per state data. Kauai’s rates have dropped from the highest in the state to the lowest, as KIUC shifted from costly imported fossil fuels to cheaper solar generation.

      WOW

    1. but also depend on a global renewable energy production whose capacity cannot exceed 30% globally (EIA),

      I don't understand this reference

    2. It should be noted that in France, regulations do not allow this market-based approach when reporting company level CO2e emissions : “The assessment of the impact of electricity consumption in the GHG emissions report is carried out on the basis of the average emission factor of the electrical network (…) The use of any other factor is prohibited. There is therefore no discrimination by [electricity] supplier to be established when collecting the data.” (Regulatory method V5-BEGES decree).

      Companies are barred from using market based approaches for reporting?

      How does it work for Amazon then?

    1. In the end, there was an undisclosed settlement between Verizon and Mozilla, but ComputerWorld later reported that financial records showed a $338 million payment from Verizon in 2019. On top of revenue-sharing with Google, that payment drove up Mozilla's revenue, which in 2019 reflected "an 84 percent year-over-year increase" that was "easily the most the open source developer has booked in a single year, beating the existing record by more than a quarter of billion dollars," ComputerWorld reported. Perhaps that bonus payment made switching back to Google even more attractive at a time when Baker told the court she "felt strongly that Yahoo was not delivering the search experience we needed and had contracted for."

      Wow, it represented a 340 million USD bonus to switch from Yahoo to Google?

    1. Finally, over a longer time frame, novel technologies could drastically lower battery-related mineral demand for nickel and copper in particular, but the mineral intensity of next-generation battery chemistries remains uncertain and could even increase demand for some battery minerals. For example, solid-state battery chemistries could increase lithium demand by up to 28%.11

      Why are we not talking about copper and nickel, and talking so much about lithium and cobalt?

      Is it just the relative novelty?

    2. Figure 1. Comparison of ore extraction in IEA NZE scenario and sensitivity of substantially improved recycling and potential ore grade decline

      This implies that total mining would go DOWN under a transition, possibly by half.

      The ore grade decline is less of an impact than I expected too.

    3. Ore extraction is largest for EVs, growing 55 times from 2021, compared to 13 and 9 times for solar PV and wind power, respectively

      The share of mining going to cars is not really something we discuss enough in digital technology circles

    4. This means that the high demand for minerals through the energy transition is of a (temporary) stock building nature, while fossil-related extraction is continuous and dissipative. In the longer term, the decommissioning of end-of-life renewable generation provides opportunities for reuse or recycling. This can mitigate demand of primary produced minerals for new renewable installations

      We can use the byproducts of critical mineral mining. Not so much with fossil fuels

    1. For the following types of tasks, users did NOT appreciate being sent to a new browser tab or window:
      • multistep pages
      • quickly checking a new page rather than a focussed read
      • overloading the browser tab bar
  15. Oct 2023
    1. For starters, if you are seeking to use a green financial instrument to finance your data construction project anywhere in the world, and have European investors, then the Taxonomy Climate Delegated Act (TCDA) will apply. This requires the operator to implement 106 of the EU Code of Conduct for Data Centre (Energy Efficiency) best practices as well as undertake various other activities

      Oh, this is new to me - if you want the cheap money, you need to meet the EU Code of Conduct for Data Centres best practices

    1. While it is capable of providing constant power, hydrogen fuel cells are also being considered for providing backup power to data centers. This is greatly appealing to data center operators as a more environment-friendly replacement for traditional diesel generators. This change would see the use of fast-start fuel cells, such as proton exchange membrane (PEM) fuel cells which could take the place of diesel generators.

      Proton exchange membranes are presumably the newer type of fuel cell - more flexible and running at lower temperatures, but pricier

    2. The key benefit and primary motivation for installing hydrogen fuel cells within a data center is to reduce carbon emissions. As stated, some fuel cells, such as SOFCs, can use natural gas. While it is less damaging to the environment than diesel, it still results in significant carbon emissions.

      Is this some of the missing context for the CCS-next-to-datacentre patents from M$?

      If you can capture the CO2, and use the waste heat to separate the CO2 from the absorbing material, then it might improve the economics of the SOFC fuel cells, AND deal with the CO2 emissions problem.

    3. SOFCs and PEM fuel cells differ from one another in their construction, materials, and operation. In a high-level view, the primary differences are the electrolyte materials (where the hydrogen and oxygen react) and operating temperatures. SOFCs operate at high temperatures, requiring longer start-up times and as a result, only being suitable for continuous power supply. PEMs, by contrast, operate at lower temperatures and are capable of fast-start or continuous operation, but are a more expensive option.

      Oh wow, so there are two kinds of fuel cells, and the expensive one is the fast ramp up one

    1. decade ago, these plants provided “low carbon” electricity in comparison to the grid at the time, but now in many cases, emit more carbon than local grids. Countries that already have decarbonized grids, France, Sweden, and Scotland, for example, will not benefit from a continuous system that uses natural gas to begin with.

      What would the green premium be for biomethane in this scenario? As in methane from biogenic sources. Supply issues aside, obvs.

    2. One suggestion is for greater integration of energy systems, which would see data centers located adjacent to energy industries or having data centers integrated with hydrogen-generating plants and fuel cells. This solution sidesteps planning problems by locating data centers alongside low-carbon energy industries. A major issue with this (aside from the available land) is blurring the lines between the data center operators, utility providers, and energy companies.

      This flips the assumptions of what a datacentre looks like. If power density keeps increasing, instead of a DC with on-site power, you might have a power plant with an on-site DC

    3. From a mechanical perspective, designing a hydrogen storage system is significantly more complex than a diesel storage system. Hydrogen has more storage options available however, it presents higher risks than diesel such as greater flammability and explosivity, higher pressures, potential for low temperature, or chemical storage methods which are all hazardous. This therefore requires the mechanical design for such a system to comply with rigorous safety standards.

      OK, so H2 unsurprisingly is vastly more explodey, and harder to store safely and cheaply

    1. And if you look at the total consumption of semiconductors by the Chinese manufacturing industry, then China imports more semiconductors than they import oil.

      China spends more importing semiconductors than on importing oil? really?

      This is from Peter Wennink, CEO of ASML

    1. Google has argued that switching search engines is just a click away and that people use Google because it's the superior search engine. Google also argued at trial that Microsoft's failures with Bing are "a direct result of Microsoft’s missteps in Internet search."

      This is interesting - I wonder how 3rd parties like Mozilla or Vivaldi testify?

      If they say it's hard, they contradict their own marketing, and risk their main source of revenue.

      If they say it's easy, they risk undermining all their own comms around the importance of choice, and the necessity of more diverse ecosystems.

    1. Provide a transitional implementation for network operators and protocols that do not yet support standards-based Layer 2-3

      This suggests to me that there is a lot of proprietary layer 2-3 in IOT. Is this the case?

    1. Despite the unknowns, the technology definitely piques the interest of owners and operators of data center facilities, with analyst firm Omdia noting that more than 80 percent of survey respondents will most likely deploy grid-interactive UPSs within the next five years. As usual for this industry, the technology will require more deployments and for it to mature before it becomes the standard way data centers and other mission-critical facilities will be built.

      I wonder how much extra flex and capacity this would represent?

    2. While exact numbers are not disclosed, such as battery capacity and how much Microsoft is willing to make available for grid interactivity, the company claims that, over the next couple of years, this move will remove about two million metric tons of carbon dioxide emissions that would otherwise be generated from Ireland’s National Grid.

      This is about an eighth of Microsoft's reported emissions in 2022, I think