87 Matching Annotations
  1. Feb 2020
    1. The Internet's share of the global electricity consumption was 10% in 2014 (Mills, 2013): As a reference, the entire global residential space heating in 2014 consumed the same amount (International Energy Agency, 2017a).

      Heating ALL the homes in all the world uses about the same amount of energy as the internet, according to this paper

  2. Nov 2019
    1. The emissions from one return ticket from London to New York are roughly equivalent to that of heating a typical home in the EU for a whole year (European Commission, 2019).

      Holy balls. I've never seen it compared in those terms before.

  3. Oct 2019
    1. If the current developments determined by Borderstep continue, the energy consumption of data centers will double by 2030 compared to today.

      So, doubling from today assumes around 700 billion kWh in 2030

    2. While Shehabi et al. assume that maximum power consumption is constant [4], the Borderstep model assumes an increase in maximum power consumption due to a significant increase in the average amount of RAMs and multiprocessor systems.

      Ah! That makes sense: a big loadout in a server, with loads of cores and RAM, is deffo gonna make it need more juice.

    3. According to a Borderstep Institute estimate, worldwide energy consumption of server data centers increased by about 30% to 287 billion kWh between 2010 and 2015 [9]. This increase accelerated once again in the last two years. A current TEMPRO project estimate concludes that between 2015 and 2017 the energy consumption of data centers worldwide increased by approx. 20% to 350 billion kWh

      By comparison, the IEA figures generally say energy use has stayed around 200 billion kWh as of 2020.

    4. The analyses and results presented in this article were produced as part of the project TEMPRO – "Total Energy Management for professional data centers." TEMPRO is supported through the 6th Energy Research Programme of the German Federal Government.

      Are there similar studies in other countries?


  4. Sep 2019
    1. Our approach to quantifying our carbon footprint reflects the complexity of our business. Our team of researchers and scientists have combined cutting-edge life cycle assessment (LCA) science and Amazon Web Services (AWS) big data technology to develop a robust software solution that processes billions of operational and financial records from Amazon’s operations across the world to calculate our carbon footprint. The software estimates carbon emissions for all activities within our system boundary using a dollar-based environmental assessment model, then enhances the accuracy of carbon-intensive activities with detailed, process-based LCA models.

      This sounds a lot like a top-down approach as outlined by systems like Ecocost or Trucost, then supplemented with bottom-up modelling of specific processes

    1. Purchase green power certificates. In July 2017, China launched a pilot program that permits voluntary trade of green power certificates from solar and wind power. Each certificate represents 1 MWh of electricity. Buying green power certificates allows companies to claim environmental benefits associated with renewable energy generation, even if electricity from a renewable power plant does not feed directly into a data center facility.

      Oh, wow, they do RECs too now? I wonder if they publish them too

    2. 196.96

      this forecast is almost the same as the IEA's earlier reported figure for global energy demand from datacentres

    1. 194

      This basically says that efficiency gains, mainly from cloud, are holding back absolute growth in energy use.

  5. Aug 2019
    1. If you’re part of an existing climate campaigning group, think of how you can participate in or help organise around September 20 to keep the momentum going. Link your action explicitly to the school strikes (“We’re doing this because we’re answering the call of the striking students, and taking action”).
    1. Innovation: This ice-making submarine would pop out bergs to help fight climate change. The proposed vessel would ply polar waters, but scientists have their doubts about its effectiveness.
  6. Jul 2019
    1. If you are an individual working, or planning to work, in this industry, then by signing you declare you won’t work on fossil fuel clients. No one is policing you or checking up, this is a promise to yourself.

      Explicit about the lack of monitoring here.

    2. If you lead an agency, by signing you promise to disclose your turnover by sector, and highlight any climate conflicts. Check the Client Disclosure Reports on this site to see what we mean. Your deadline is end 2019 to disclose.

      Simple, explicit. Disclose turnover by sector, they have an example.

    3. Our ultimate goal is to divest creative talent from destruction.

      This is some good wordsmithery

    1. The backend data repository is based on MySQL. The repository contains 16 tables that capture the various source information described above. The current size of the repository (excluding the real-time data) is 92 MB.

      ~100 MB for all the data, mapping all that infra?

    2. Provider maps often contain additional information about network node resources. This information can range from location (potentially down to Lat/Lon coordinates), to IP addresses, to resource or service types. Our ability to extract network node information from the discovered resources is dependent on an assembly of scripts that include Flash-based extraction and parsing tools [3], optical character recognition parsing tools [7], PDF-based parsing tools [8], in addition to standard text manipulation tools. This library of parsing scripts can extract information and enter it into the database automatically. For instances where none of the tools or scripts are successful on the provider data, we manually parse and enter the data.

      This sounds like a meaningful thing you could use OSM to augment, and the argument for doing it makes sense - "you like the internet right? So help map it, before you lose it"

    3. Visualization-centric representations often reveal no information about link paths other than connectivity (e.g., line-of-sight abstractions are common). For these we enter the network adjacency graph by hand into Atlas. However, some maps provide highly detailed geographic layouts of fiber conduit connectivity (e.g., Level3 [5]). We transcribe these, maintaining geographic accuracy, into the Atlas using a process and scripts that (i) capture high resolution sub-images, (ii) patch sub-images into a composite image, (iii) extract a network link image using color masking techniques, (iv) project the link-only image into ArcGIS using geographic reference points (e.g., cities), and (v) use link vectorization in ArcGIS to enable analysis (e.g., distance estimation) of the links.

      So, it sounds like they're using some kind of computer vision to compare a set of tiles against another image showing the infrastructure, to work out the rough coords to project onto a map

    4. In addition to Internet search, we appeal to the large number of existing Internet systems and publicly available data they provide. This includes PeeringDB [9], Network Time Protocol (NTP) servers, Domain Name System (DNS) servers, listings of Internet Exchange Points (IXPs), Looking Glass servers, traceroute servers, Network Access Points, etc. Beyond their intrinsic interest, it is important to recognize that NTP servers [6] often publish their Lat/Lon coordinates and are typically co-located with other networking/computing equipment. Similarly, DNS servers routinely publish their location via the LOC record [18]. In total, over 4,700 network resources of various types are annotated in the Internet Atlas database.

      Okay, this is properly smart, and uses pretty much all the data sources I would have thought to look at.

    5. Third, the de facto use of IP addresses gathered from TTL-limited probing campaigns as the basis for inferring structure has inherent difficulties. These include the well known interface disambiguation problem [26], widely varying policies on probe blocking among providers, and difficulties in managing large scale measurement infrastructures [24]. We believe that a different approach to building and maintaining a repository of Internet maps is required.

      So, this basically says traceroute by itself isn't enough

    1. Given the large number of nodes and miles of fiber conduit that are at risk, the key takeaway is that developing mitigation strategies should begin soon.

      So, I guess this would be a reasonable question to ask, right?

    2. To localize overlap we develop a Coastal Infrastructure Risk (CIR) metric that highlights the concentration of Internet infrastructure per geographic location (e.g., city). The CIR metric will be used to elucidate the impact of sea level rise on Internet assets temporally. Using CIR, we identify the top 10 major geographic locations most at risk, and thus in need of action by municipalities and service providers to secure existing deployments and plan for new deployments.

      Wow, this is equal parts fascinating and horrifying

    3. The first is the communication fiber conduit and termination point data in Internet Atlas

      AH HA! THAT'S where the data is from.

    4. We also quantify the risks to individual service provider infrastructures and find that CenturyLink, Inteliquent, and AT&T are at highest risk.

      Would these show up in their SASB or climate risk reporting (i.e. the TCFD, the Task Force on Climate-related Financial Disclosures)?

    5. In this paper we consider the risks to Internet infrastructure in the US due to sea level rise. Our study is based on sea level incursion projections from the National Oceanic and Atmospheric Administration (NOAA) [12] and Internet infrastructure deployment data from Internet Atlas [24]. We align the data formats and assess risks in terms of the amount and type of infrastructure that will be under water in different time intervals over the next 100 years. We find that 4,067 miles of fiber conduit will be under water and 1,101 nodes (e.g., points of presence and colocation centers) will be surrounded by water in the next 15 years.
    1. incorporate suppliers with negative impacts on sustainability (identified in level 1) into the audit process as part of risk management.

      This seems to imply suppliers who can't disclose emissions would be seen as a risk in future

    1. Mark Parrington, a senior scientist at the European Centre for Medium-Range Weather Forecasts, said the amount of CO2 emitted by Arctic wildfires between 1 June and 21 July 2019 is around 100 megatonnes and is approaching the entire 2017 fossil fuel CO2 emissions of Belgium.

      Fuuuuuuuucking hell. Seven weeks of Arctic wildfires comes close to all the fossil fuel emissions from Belgium in 2017.

    1. provide a regularly refreshed set of minimum ICT sustainability provisions (including energy/carbon reporting)

      OK. Here's what I would ask to see in an FoI. I imagine any other organisation thinking about the emissions from digital services would also benefit from seeing these, as

      a) they're likely covered by the binding legal targets set for the UK, and b) I've spoken to a few organisations who have asked me for exactly this themselves

    2. quantify and report on its e-waste and energy and carbon footprint of the digital and technology services used and their sustainability impacts

      So, this looks like a pretty explicit commitment to measure the carbon footprint of digital services to me.

      It seems like it might be FOI-able, as Paul suggested. Anyone?

  7. Jun 2019
  8. May 2019
  9. Apr 2019
    1. Air pollution contributed to nearly one in every 10 deaths in 2017, making it a bigger killer than malaria and road accidents and comparable to smoking, according to the State of Global Air (SOGA) 2019 study published on Wednesday. In south Asia, children can expect to have their lives cut short by 30 months, and in sub-Saharan Africa by 24 months, because of a combination of outdoor air pollution caused by traffic and industry, and dirty air indoors, largely from cooking fires. In east Asia, air pollution will shorten children’s lives by an estimated 23 months. However, the life expectancy burden is forecast to be less than five months for children in the developed world.

      And we still SUBSIDISE fossil fuels

  10. Mar 2019
    1. Despite the existence of these two contradictory trends, even the most optimistic studies express concerns regarding the capacity of technological progress to counter the growth in volumes by 2020. For example, this report from the American Department of Energy and the University of California on the energy consumption of data centers in 2016 in the United States, states: "The key levers for optimizing the energy efficiency [of data centers] identified in this report, better PUE, better rate of use of servers and more linear consumption all have theoretical and practical limits and the amount of progress already achieved suggests that these limits will be reached in the relatively near future." (Shehabi, A. et al., 2016)

      Okay it was that same paper they referred to.

    2. India plans to launch a massive program to deploy commercial 5G networks in 2020 to boost the performance and capacity of existing mobile networks, taking into account that the 4G networks (which only took off in 2017 due to a price war over data started by the telecommunications operator Reliance Jio) are making big advances towards general coverage.

      Hello, so they do reference the massive increase in data and data plans

    3. Scenario 3 –ideal case, where the exchanges are made exclusively with the platform.

      His number seems super duper high

    4. We have therefore sought to identify levers of action more closely related to the demand and consumption of digital services than on the energy efficiency of supply.

      This is long overdue. Really glad to see this

    5. Spending 10 minutes watching a high definition video by streaming on a smartphone is equivalent to using a 2,000W electric oven at full power for 5 minutes

      Waaaaaaaat
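      A quick sanity check on this claim (my own back-of-envelope arithmetic, not from the report), since the oven side of the comparison pins down the implied energy:

```python
# Back-of-envelope check: the oven side of the comparison fixes the energy.
oven_power_w = 2000            # oven at full power
oven_minutes = 5
oven_energy_wh = oven_power_w * oven_minutes / 60   # ~167 Wh

# If 10 minutes of HD streaming on a phone really uses the same energy,
# the implied average power draw attributed to that streaming session is:
streaming_minutes = 10
implied_power_w = oven_energy_wh / (streaming_minutes / 60)

print(f"Oven energy: {oven_energy_wh:.0f} Wh")
print(f"Implied streaming draw: {implied_power_w:.0f} W")
```

      That attributes roughly a kilowatt of continuous draw to phone streaming, which is why this figure has been widely disputed: nearly all of it would have to come from very pessimistic network and data center intensity assumptions, not the device itself.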

  11. Feb 2019
    1. Last year, Google quietly started an oil, gas, and energy division. It hired Darryl Willis, a 25-year veteran of BP, to head up what the Wall Street Journal described as “part of a new group Google has created to court the oil and gas industry.” As the VP of Google Cloud Oil, Gas, and Energy, Willis spent the year pitching energy companies on partnerships and lucrative deals. “If it has to do with heating, lighting or mobility for human beings on this planet, we’re interested in it,” Mr. Willis told the Journal. “Our plan is to be the partner of choice for the energy industry.”

      Jeez. At what point do we grow a spine and take climate change seriously?

    1. Salesforce was the first major internet company that exclusively leased data center space to adopt a 100 percent renewable energy commitment in 2013. Salesforce has multiple data center leases in Data Center Alley, totaling 46 megawatts, including a massive new lease with QTS in its new Manassas data center.

      How to do green DCs when you don't own DCs

    2. But despite recent creative claims of being “100 Percent Renewable Globally” from surplus supply of renewable credits in other markets,[66] Google has not yet taken steps to add renewable energy to meet the demand of its data centers in Virginia

      Ah! So they do the "RECs in other markets" too!

    3. In 2018, five major IT brands with long-term commitments to renewable energy[52] and who operate data centers or have significant colocation leases in Virginia sent a letter to the Virginia State Corporation Commission (SCC) asking that they not be used by Dominion to justify new fossil fuel growth, asking instead for a greater supply of renewable energy.[53] The SCC ultimately rejected Dominion’s Integrated Resource Plan for the first time in December 2018, providing an important opportunity for additional large corporate customers to tell regulators they need a greater supply of renewables, not more investment in fossil fuel generation assets or pipelines like the ACP.[54]

      Wait, so these two things are related? The letter forced the SCC to respond?

    4. The rapid deployment of renewable energy and the stagnation of mandatory renewable energy targets in many states has created a large surplus of “naked” or unbundled renewable credits available at the national level for purchase by the voluntary market, driving their price to record lows, less than $1/megawatt hour.

      So, if you're a huge buyer of electricity, and you are opaque about your offsets, it's easy to imagine that you're just loading up on these.

    5. AWS customers seeking to immediately reduce carbon emissions related to their cloud hosting could request to be hosted in Amazon’s California cloud, which is connected to a grid that is 50[33] to 70[34] percent powered by clean sources of electricity

      Not Oregon?

    6. Dominion’s projected demand for the pipeline ignores the fact that six of its 20 largest customers, five of which are data center operators, have made commitments to run on 100 percent renewable energy.

      How can you publicly audit a commitment like this?

    7. However, neither of these options improves the energy mix of Virginia or influences future direction and is therefore not ideal for those companies concerned with meaningfully reducing their operational carbon emissions. Of the 15 companies measured in this report, only Apple has invested in enough renewable energy procurement to match its demand in the region

      Ok, this makes me think that companies are relying on RECs everywhere else, and crediting Apple with specifically investing directly in RE in Virginia.

    8. Company Scorecard

      OK, so this is the table used to create that wild chart above showing DC capacity, compared to renewables capacity in Virginia

    9. If Amazon and other internet companies continue their rapid expansion of data centers in Virginia, but allow Dominion to continue with its strategy to use rising data center demand to justify significant new investment in fossil fuel infrastructure, they will be responsible for driving a massive new investment in fossil fuels that the planet cannot afford.

      So this is interesting. This report seems to be more about Dominion than anything else, basically pressuring Amazon to get Dominion to step away from fossil fuels

    10. Dominion Energy, Virginia’s largest electricity provider and the primary electric utility for Data Center Alley, has strongly resisted any meaningful transition to renewable sources of electricity, currently representing only 4 percent of its generation mix, with plans to increase to only slightly over 10 percent by 2030.[1]

      Wow, 10% by 2030? That it?

    1. …despite the coal-friendly policies of the central government. A study showed that Australia is currently installing 250 watts of PV or wind for each inhabitant per year. The EU and US are about one fifth of this. If this rate of growth continues, Australia will reach 50% renewables by 2024 and 100% of electricity demand by 2032. Costs of new large scale PV and wind are now around US$35/MWh, lower than the running costs of older coal stations.

      Wow, go Australia

    1. Williams and Tang (2013) [8] performed a rigorous and detailed energy consumption analysis of three cloud-based office productivity applications. They analyzed the power consumption of the data center, network, and user devices that access the cloud service. The study also performed an energy consumption analysis on “traditional” noncloud versions of the software to understand the overall impact of cloud services.

      Are the findings accessible publicly?

    2. Average power consumption can be estimated from single points, although this results in increasing uncertainty. Manufacturers publish the maximum measured electricity (MME) value, which is the maximum observed power consumption by a server model. The MME can often be calculated with online tools, which may allow the specification of individual components for a particular server configuration. Based on these estimations of maximum power consumption, the average power consumption is commonly assumed to be 60 percent of MME for high-end servers and 40 percent for volume and mid-range servers.

      okay, this is a useful stat. I think
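      The rule of thumb from that passage fits in a couple of lines (a sketch of the 60%/40% assumption; the MME values below are made up for illustration):

```python
def average_power_w(mme_w: float, server_class: str) -> float:
    """Estimate average server power from the maximum measured
    electricity (MME), using the rule of thumb quoted above:
    60% of MME for high-end servers, 40% for volume/mid-range."""
    factor = 0.6 if server_class == "high-end" else 0.4
    return mme_w * factor

# Hypothetical MME figures, e.g. from a manufacturer's online tool
print(average_power_w(750, "high-end"))  # 450.0
print(average_power_w(350, "volume"))    # 140.0
```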

    3. In most cases, the “embodied emissions” (all stages excluding the use stage) of software are not significant compared with the overall emissions of the ICT system, particularly when the embodied emissions caused by development of the software are amortized over a large number of copies. In these cases, it is not necessary to carry out a detailed life cycle assessment of the software as part of a wider system. An exception is where bespoke software has very high emissions associated with its development, and these emissions are all allocated to a small number of software copies.

      Smaller, internal software might count

    4. Currently, input-output (IO) tables are published every five years, a long time in IT product evolution. Consequently, EEIO is good at representing basic commodities / materials industries like plastics or metals manufacturing, but not high-tech industries like microprocessors and fiber optic lasers manufacturing.

      Every 5 years. So, when the iPhone 6 was the brand new hotness, compared to today.

    5. Calculating cradle-to-gate GHG emissions of IH by the component characterization method

      Ah! This is new to me

    6. EEIO tables are updated infrequently thus may not be up to date with ICT’s newest technologies and materials. EEIO tables have limited resolution at the aggregate sector level.

      Valuable problem to solve?

    7. rapidly with the onset of innovations, but lag in being included in EEIO databases available to the practitioner. More detail on EEIO data is provided in the calculation sections below

      Useful point. Because the top-down data is lagging, it'll give worse than expected figures for hardware

    8. It is interesting to note that the figures from GSMA and GeSI show that energy intensity per gigabyte is improving at about 24% per year for mobile networks, and at about 22% per year for fixed line networks. (The study by Aslan et al calculates a figure of 50% reduction in energy intensity every two years for fixed line networks, equivalent to 29% reduction per year.) Also the data shows that the energy intensity per gigabyte for mobile networks is about 50 times that for fixed line networks.

      Okay, this isn't that far from the 45x figure before

    9. Assuming that the reduction in energy efficiency can be fitted to an exponentially decreasing curve (i.e. because it is more and more difficult to achieve the same reductions), then the data points can be extrapolated to give energy intensity factors for 2015 of 0.15 for fixed line networks, and 6.5 for mobile networks, with both factors measured in kWh/GB (kilowatt-hours per gigabyte).

      FORTY FIVE TIMES MORE ENERGY INTENSIVE THAN WIRED
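      Worth noting: the ratio implied by those two extrapolated 2015 factors is a bit lower than 45 (my arithmetic):

```python
fixed_kwh_per_gb = 0.15   # extrapolated 2015 fixed line intensity
mobile_kwh_per_gb = 6.5   # extrapolated 2015 mobile intensity

ratio = mobile_kwh_per_gb / fixed_kwh_per_gb
print(f"Mobile is ~{ratio:.0f}x more energy intensive per GB")
```

      6.5 / 0.15 comes out at roughly 43x, so "forty five times" is a slight round-up.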

    10. A simple energy intensity factor for the use of the internet would make calculating the emissions resulting from ICT simpler and more widely accessible. Whilst this has been attempted in the past, resulting estimates show huge disparities. Coroama and Hilty [6] review 10 studies that have attempted to estimate the average energy intensity of the internet where estimates varied from 0.0064 kWh/GB to 136 kWh/GB, a difference factor of more than 20,000.

      TWENTY THOUSAND TIMES DIFFERENCE.

      Did I drive across London? Or did I drive to the moon?

    11. Typically, for cloud and data center services the largest impacts are from the use stage emissions of the data center and the end user devices.

      End user devices?

    12. Simplified parameters for allocation of data center emissions include

      Useful for the screening stage

    13. Optional processes that are not attributable to the GHG impact of cloud and data center services are:

      ?

    14. The end-of-life stage typically represents only -0.5 to -2 percent of a service’s total GHG emissions. This is because of the high level of recycling of network equipment.

      Where would I check to learn this?

    15. For global average electricity emission factors across 63 countries where the points of presence were located, data from the Carbon Trust Footprint Expert Database was used.

      Is this database free?

    16. Operational activities and non-ICT support equipment covers people (labor) activities and non-ICT support equipment and activities that are directly engaged and dedicated to the service being assessed.

      in addition to the other emissions

    17. For example, these measurements might involve running a series of traffic traces [16] over a period of time to build up statistics on network parameters. The measurements also need to include the energy consumption for the network’s ancillary equipment such as cooling, power conditioning, and back-up power. If this latter data is not attainable, then techniques described in Section 2.8.2 “Calculating GHG emissions for the customer domain use stage,” (TPCF and PUE factors), can be used to provide an estimated value for this equipment.

      Validation!

    18. Equipment manufacturers may have estimates of TPCFs for their equipment based on defined operating conditions. In all cases, if a TPCF approach is selected, then the basis for selecting the factor should be fully noted and documented in the GHG inventory report

      How much do these figures fluctuate for cloud boxen?

    19. Allocation of emissions among independent products that share the same process: for example, multiple products sharing the same transport process (vehicle); multiple telecommunication services sharing the same network; multiple cloud services (email, data storage, database applications) sharing the same data center

      k8s makes this a pain, if it's designed to co-mingle services on the same boxen
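      The simplest way through the allocation problem described here is proportional allocation by a chosen usage metric (a sketch, not the standard's prescribed method; the services and CPU-second figures are hypothetical):

```python
def allocate_emissions(total_kgco2e: float,
                       usage_by_service: dict[str, float]) -> dict[str, float]:
    """Split shared infrastructure emissions across services in
    proportion to each service's share of a chosen usage metric
    (e.g. CPU-seconds, memory-hours, or bytes transferred)."""
    total_usage = sum(usage_by_service.values())
    return {svc: total_kgco2e * use / total_usage
            for svc, use in usage_by_service.items()}

# Hypothetical cluster: 100 kgCO2e shared by three co-mingled services,
# split by CPU-seconds consumed.
shares = allocate_emissions(100.0, {"email": 500, "storage": 300, "db": 200})
print(shares)  # {'email': 50.0, 'storage': 30.0, 'db': 20.0}
```

      For k8s-style co-mingling this is exactly the per-service metering you'd need the scheduler to expose.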

    20. A “screening assessment” is an initial assessment of a product to understand its significant and relevant sources of emissions. This assessment is described in the Product Standard in section 8.3.3.

      Entry level

    21. This chapter provides software developers and architects guidance to benchmark and report the GHG emissions from software use in a consistent manner and make informed choices to reduce greenhouse gas emissions. The chapter is in two parts. Part A provides guidance on the full life cycle assessment of software, while Part B relates specifically to the energy use of software, and covers the three categories of software: operating systems (OS), applications, and virtualization.

      actual formal guidance!

    22. In 2015 GeSI published the SMARTer 2030 [8] report, extending the analysis out to 2030. This study predicted that the global emissions of the ICT sector will be 1.25 Gt CO2e in 2030 (or 1.97% of global emissions), and emissions avoided through the use of ICT will be 12 Gt CO2e, which is nearly 10 times higher than ICT’s own emissions.

      there's a 2030 report now. I did not know

    23. the total abatement potential from ICT solutions by 2020 is seven times its own emissions.

      increasing confidence in the potential then

    24. Rebound Effects

      🤯

    25. The Product Standard (sections 11.2 and 11.3.2) states that“avoided emissions shall not be deducted from the product’s total inventory results, but may be reported separately.”

      ah, so it's not like a magic offset. more transparent. good.

    26. The Product Standarddefines products to be both goods and services, thus for the ICT sector it covers both physical ICT equipment and delivered ICT services. This Sector Guidance, however, focuses more on the assessment of ICT services. In this Sector Guidance the definition of products includes both networks and software as ICT services.

      this makes me think that services like e-commerce or ride sharing might not count on the first read-through

  12. Dec 2018
    1. As the chief executive of the world’s biggest cement company observed, “we know how to make very low carbon cement – but why would we? There is no incentive.”

      FUCKING HELL

    2. In the growing trade war between China and the US, it seems the world is unwilling even to think about the entirely legitimate use of consumption-based or border carbon pricing either to encourage cleaner production in China, or to deter the Trump administration from using discriminatory trade measures to re-industrialize drawing partly on older and more carbon-intensive technologies.

      How would border carbon pricing work? You pay a tax on the CO2 emissions 'imported'?

    3. The European utilities that tried to ignore the energy transition are now economic zombies; some split their companies in two to try and isolate the assets that have already turned into liabilities in a decarbonizing system.

      E.ON as an example?

    4. Averaged over the full 35 years, a constant percentage reduction would require c = - 10%/yr to reach the same end-point – almost impossible at the starting point, but entirely feasible and easily observed in the latter stages of sunset industries.

      Okay, so the argument as I see it so far: change might average out at 3.68% per year, but assuming it's a straight line is a mistake, as substitution of high carbon energy for low carbon looks more like an S-shaped curve
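      To see why the shape of the curve matters, a quick illustration of the constant-percentage case (my own arithmetic, using the -10%/yr figure from the quote):

```python
years = 35
rate = 0.10   # constant 10% reduction per year

remaining = (1 - rate) ** years
print(f"After {years} years at -10%/yr: {remaining:.1%} of starting emissions")
```

      That lands at about 2.5% of the starting level, essentially the same endpoint as a straight-line path to zero, but with the largest absolute cuts front-loaded, which is why the quote calls a constant -10%/yr "almost impossible at the starting point".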

    5. their analysis leads both teams to the – only slightly caveated – conclusion that the emission reductions required to deliver the Paris Aims (“well below 2 deg.C”) are implausible, by almost any standard of macroeconomic evidence – and still more so for the most ambitious “1.5 deg.C” end of the spectrum.

      Ah, so this is a response to the "we're doomed" papers from before

    1. Electricity Intensity of Internet Data Transmission: Untangling the Estimates

      This is the HTML version of the PDF I was referring to before.

  13. Oct 2018
    1. Video streaming service Netflix is the world's most data-hungry application, consuming 15% of global net traffic, according to research from bandwidth management company Sandvine.

      Ah, there's a new Sandvine report for 2018

  14. Sep 2018
  15. Nov 2017
    1. Figure 4: Typical diurnal cycle for traffic in the Internet. The scale on the vertical axis is the percentage of total users of the service that are on-line at the time indicated on the horizontal axis. (Source: [21])

      I can't see an easy way to link to this graph itself, but this reference should make it easier to get to this image in future