1,183 Matching Annotations
  1. Nov 2021
    1. Supporters of the Paris Agreement argue that such concessions were necessary to produce a successful deal. Moreover, former US climate negotiator Todd Stern has written in Brookings that legally binding targets, “paradoxically, can yield weaker action as some countries low-ball their targets for fear of legal liability”.

      So the argument he's making is that countries wouldn't set such ambitious targets if they knew they might be sued for not delivering on them?

    2. At the Durban COP in 2011, nations agreed to forge ahead with a new deal that, unlike Kyoto, would be “applicable to all” nations, both “developing” and “developed”. The EU’s lead negotiator, Elina Bardram, said it “must reflect today’s reality and evolve as the world does”, while US lead negotiator Todd Stern stated their position plainly: “If equity is in, we are out.”

      “If equity is in, we are out.” Wow

    3. The map below shows the results of their most recent study (pdf). “Climate risk index” is mapped for most countries in the world – where a lower score and darker colouring indicates that the country has greater exposure and vulnerability to extremes.

      Why such a high risk outlier in Western Europe?

    4. One study finds that in more than 70% of US counties, “neighborhoods with lower-income and higher shares of non-white residents experience significantly more extreme surface urban heat than their wealthier, whiter counterparts”. Another study has found that black people are 52% more likely than white people to live in areas of unnatural “heat risk-related land cover”. Asian people are 32% more likely and Hispanics 21%.

      Check if there is any overlap between these counties and common datacentre clusters

    5. For example, in June of this year, the city of Jacobabad in Pakistan reached 52C, pushing it over the 35C wet bulb temperature limit. This made the city one of only two places on earth to have officially passed the threshold. The Daily Telegraph reported that the extreme heat was exacerbated by the lack of wealth and resources in the region

      holy shit. it's already here?

    6. “To actively promote such fossil-fuel development and then punish developing countries for emissions through carbon border adjustments is, at best, hypocritical. It’s also unjust.”

      Ah ok.

      So you lend money to build fossil infra, then you make it hard to buy stuff made with the infra compared to your own clean infra supply chain.

      The result is that the borrower loses out, as they only ever get finance for fossil fuel infra, and can never compete

    7. This is why carbon border adjustments of the variety being proposed by the EU, which place taxes on imported goods based on the emissions associated with their production, have been criticised and even labelled a form of “economic imperialism”.

      A CBAM (carbon border adjustment mechanism) as a form of economic imperialism? I've heard of protectionism, but I hadn't considered that take.

    8. One study found that most European countries would be unable to meet their removal targets domestically, meaning they would have to rely on climate finance or offsetting schemes to support removals abroad.

      Oh.

    9. “The closer we get to 1.5C…the question becomes less about fair shares in terms of greenhouse gas mitigation and more about fair shares in terms of CO2 removal as well.”

      This might be a framing around the carbon colonialism thing - you only get to do negative emissions if they're domestic ones.

      Given how much energy DAC needs, you likely still end up with domestic investment in clean power, so there's still a material footprint

    10. In the UK, the Climate Change Committee has advised that the nation should meet its net-zero target domestically, “without relying on international carbon units”.

      This directly contradicts basically every climate tech firm I can think of in the UK right now

    11. However, once again the pushback is largely based on the extent to which these targets rely on offsets and carbon removals at the perceived expense of emissions cuts, particularly if the effort is shifted to the global south in acts of “carbon colonialism”.

      Carbon colonialism is a new term for me.

    12. In 2004, academics and NGOs signed the Durban Declaration on Carbon Trading, which criticised the way in which markets turned “the earth’s carbon-cycling capacity into property to be bought or sold in a global market”. This became a key foundational text for climate justice groups.

      Oh jeez, there's just so much to read

    13. Criticisms were levelled at the Kyoto Protocol after the US pushed for the inclusion of a carbon market in the agreement.

      I did not know they pushed for the carbon market, but it makes sense. If you have a market-based instrument that you can make cheaper than action, it'll be politically attractive, as it means you don't need to change so much

    14. Harjeet Singh, a senior adviser on climate impacts at CAN International, tells Carbon Brief that it is clear why wealthy nations want to avoid admitting responsibility: “The numbers are massive and rich countries and corporations know that they are going to be blamed and that’s why they have always been scared of recognising loss and damage and the related compensation and liability provision.”

      Note to self, check this against this:

      https://climateprinciplesforenterprises.org/

    15. With the UNEP adaptation gap report suggesting that adaptation costs “in developing countries alone” could reach $140-300bn in 2030, there are widespread calls for at least half of climate finance to go towards adaptation. This echoes language in the Paris Agreement urging a “balance between adaptation and mitigation” finance.

      Half / half.

      Useful reference as typically adaptation is often presented as a way to excuse inaction, or generally seen as an admission of defeat

    16. Most climate finance also comes in the form of loans

      Given interest rates are hovering around 0%, what kind of rates are offered here? The cost of capital for renewables projects in the global south is notoriously high compared to the global north.

    17. “What we are asking is repayment…We are not begging for aid. We want developed countries to comply with their obligation and pay their debt.”

      That's a pretty powerful framing when combined with the "carbon space" concept

    18. This is recognised in the Bali Principles of Climate Justice, developed by NGOs in 2002, which note that “unsustainable consumption exists primarily in the north, but also among elites within the south”. Indian journalist Praful Bidwai has described the focus on national per capita emissions as “a shield that enables India’s elite to hide behind the poor”.

      "A shield to hide behind the poor." Crikey Moses.

    19. For example, in its most recent update, Climate Action Tracker (CAT) – an independent analysis of climate pledges produced by two research organisations – deemed both India and China’s commitments to be “highly insufficient” based on their “fair shares”.

      Ah, the other "Climate Action" org. They say that even China and India aren't doing enough based on their fair shares

    20. National pledges as of 2018 expressed in CO2e per capita emissions cuts (black lines), compared to different “fairness” benchmarks. Orange bars are based on national capacity alone. Blue bars are based on national historical responsibility alone. Green bars reflect both capacity and responsibility equally. The grey bar represents a “political” benchmark based on low progressivity and low responsibility settings. Bright colours represent a “high progressivity” setting and dim colours represent a “low progressivity” setting. Source: Civil Society Review.

      This took me a while to make sense of, but the simple version is: where a bar is higher than the black line, there is capacity to do more, but the country is not doing it. The closer the bars are to the line, the closer the efforts are to the "fair" share

    21. Concerns about intergenerational injustices, demonstrated more recently by the Fridays for Future protests, are one of the “four pillars of climate justice” identified by Srivastava and her colleagues in a recent paper. The others are: Distributional – how the costs and benefits of climate change and action are shared.Procedural – ensuring the processes for making decisions about the impacts of and responses to climate change are fair, accountable and transparent.Recognition – Recognising differences between groups in how they experience climate change and their right to express these differences.

      Aaaaah! There's 4 not 3! There's an EXPLICIT reference to intergenerational justice too.

      Note to self: always listen to Melissa.

    22. Using the same methodology, which prioritises nations’ historical responsibility and financial capacity to act, the US branch of Climate Action Network (CAN) has called for a “fair share” target of cutting US emissions by 195% below 2005 levels by 2030. NGOs in the UK have called for a similar reduction target of 200% below 1990 levels.  The current domestic targets for the US and the UK are 50-52% and 68%, respectively.

      Wow, this piece is eye-opening - I had no idea the "fair shares" were that much larger

    23. many fairness justifications used in NDCs, such as allocating emissions cuts where it is cheapest to do so, relied on arguments that were not supported by principles of international law. Sticking to these legal principles would tend to require deeper cuts from developed countries, the paper found.

      I didn't know legal principles would follow this.

    1. The work done must be measured and must be part of a pre-existing environmental strategy.

      This implies uniformity in how you report it. If you have an attributional approach for measuring impact at an organisational level, you'll need a way to convert between the two if you are using a consequential method for a project.

  2. Oct 2021
    1. Organizations today typically must assess their spending records and then look up tables that estimate the average emissions associated with them. This falls far short of what the world really needs, which is the ability to pull accurate and near real-time data directly from the emissions sources themselves.

      OK, so this suggests they're intending to eventually replace the extended environmental input output models that are the defaults in most carbon accounting tools with their own models.

      My guess is this would be their moat in many cases.

    1. To obtain the best set of data, we commissioned two active telecom sites in Peru. One is the baseline site that uses conventional telecom sizing and operational methodology. The other is a smartly designed site that uses fewer solar panels and batteries. By commissioning these two sites side by side, we can compare their performance over time and track relevant telecom performance indicators, such as number of connections, bandwidth, and reliability. We believe that significant savings in power costs — on the order of 40 percent to 60 percent — are possible while maintaining relevant telecom performance.

      Wow, so actively managing demand effectively HALVED the energy requirements, although it's not obvious what "while maintaining relevant telecom performance" means, as the diagram above basically suggests dropping down to 3G or 2G when power is low.

    2. Conventional telecom power system sizing of a solar-powered site is based on (1) the worst-case historic irradiance in the installation site, which can be much worse than the average irradiance, and (2) the average power consumption of the telecom system, which typically remains static and invariable over the time, no matter how the weather is and even when most of the people are sleeping and the traffic is close to zero.

      So a key thing here is that earlier thinking assumes static load the entire time. There is no real notion of scaling power usage up and down when designing the system

    3. One of our most recent collaborations is Project SEISMIC: Smart Energy Infrastructure for Mobile Internet Connectivity. In this project, we are developing a solution to smartly manage the power and functionality of telecom sites. For example, we can reduce the capacity and transmission power of the site during less busy periods. By doing so, we want to better design and operate off-grid sites in order to reduce cost and improve their sustainability.

      As I understand it, this piece is about better power management for off-grid sites providing connectivity

    1. Table 2 - Energy efficiencies of mobile networks

      These figures are quite a bit higher than the ones from Ericsson, and even from the IEA's analysis - they assume between 0.1 and 0.2 kWh/GB, and this table gives 0.35 kWh/GB for 4G

    2. Therefore, we take the 17% of app traffic associated with ATS found by Vallina-Rodriguez et al., (2016) as an upper boundary, also because in some video apps lots of advertisements are shown, and as a lower boundary we assume 10%, to use a conservative range for the carbon footprint of unwanted data-use.

      So, they assume between 1/6th and 1/10th of total transfer is unwanted ad-traffic.

    3. We assumed that the electricity use is distributed all over Europe. In the calculation we used the actual European electricity grid mix from 2019.

      OK this is good. The average grid mix might be a tad high if it isn't a weighted average, as I don't think internet usage and penetration is uniform across Europe. America, then the UK, and then Germany are the largest English speaking markets online, IIRC, so it might be worth looking at how this might affect the carbon intensity used per kilowatt hour of electricity
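      A traffic-weighted intensity is easy to sketch. Everything below is a hypothetical illustration - the per-country intensities and traffic shares are made-up numbers for the sake of the example, not figures from the study:

```python
# Hypothetical sketch: traffic-weighted vs simple-average grid intensity.
# Intensities and traffic shares below are illustrative, NOT real data.
intensities = {"DE": 0.35, "FR": 0.06, "UK": 0.23}   # kgCO2e per kWh
traffic_share = {"DE": 0.4, "FR": 0.3, "UK": 0.3}    # fraction of traffic

weighted = sum(intensities[c] * traffic_share[c] for c in intensities)
simple_avg = sum(intensities.values()) / len(intensities)

# The two answers differ whenever traffic is skewed towards certain grids.
print(round(weighted, 3), round(simple_avg, 3))
```

      The point is just that if traffic is concentrated in a few countries, the weighted figure can differ meaningfully from the flat European average used in the paper.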

    4. To compensate for 3 to 8 Mt CO2 emissions, 160 to 410 million trees need to grow for one year or between 60 and 150 million PV panels need to be installed to replace the average European electricity production.

      There are other stats that show the mortality from excess emissions, often used to show that airliners end up causing numbers of deaths that if taken into account, would ground them based on their own safety standards

    5. This analysis results in an estimated total carbon footprint of ATS data-use in Europe of between 5 and 14 Mt CO2-eq. per year. The carbon footprint of unwanted data-use by ATS is estimated to be between 3 and 8 Mt CO2-eq. per year. This is comparable to the CO2 emissions of 0.7 to 1.8 1,000 MW coal-fired powerplants, the CO2 emissions of European cities such as Turin or Lisbon, or the CO2 emissions of 370 to 950 thousand European citizens.

      So, almost a million Europeans

    6. In this study we define the unwanted data-use by smartphone apps as the network data (both cellular and Wi-Fi) used to transfer the data collected by third-party advertisement and tracking services (ATS) in smartphone apps to the third-party servers. As approximately 60% of the consumers indicate that they would turn off third-party tracking, 60% of the network data used by ATS is qualified as unwanted.

      So this is where the 60% figure comes from, for later reference

    7. Smartphone apps collect more user data (also called tracking) than most consumers prefer. Besides privacy risks, the collection of user data also results in a carbon footprint as sending this user data to third parties and showing personalised advertisements consumes network data. When the user tracking is unwanted, also the consumption of network data for tracking is unwanted.

      Another way of framing this might be shifting the cost of the web onto everyone else affected by our changing climate.


    1. This is the result of using high and outdated energy-use assumptions for various access modes – for example, 0.9 kWh/GB for “mobile” compared to more recent peer-reviewed estimates of 0.1-0.2 kWh/GB for 4G mobile in 2019.

      These numbers are less than half the Delft University ones

    1. The German National Action Plan (NAP) on business and human rights sets out the expectation for at least 50% of German companies with more than 500 employees to have introduced effective human rights due diligence measures into their business processes by 2020. According to its coalition agreement, the Government has committed to taking legal action if this target is not reached. As part of a monitoring process the Government is currently reviewing to what extent companies are meeting their due diligence obligations.On 10 July 2019 the Federal Foreign Office published the first interim report, outlining the methodology and the questionnaire for the 2019 survey. NGOs, media and parliamentarians criticized in particular the inclusion of additional evaluation groups 'Companies with implementation plan' and 'Companies on the right track' (BHRRC translation) as well as the current failure to consider 'non-responders'.

      wow, I was late to the party - I didn't know the threshold was so low for having this

    1. On every execution, the Function node compares the incoming data with the data from the previous execution. If the data got changed, we pass it to the next node in the workflow. We also update the static data with this new data so that the next execution knows what data gets stored in the previous node. If the data did not get changed, you may return a message based on our use-case.

      A-ha! If you can use this to set a 'bookmark' for the latest row in a spreadsheet/CSV, for example, you have a nice way to query for "all the things since I last ran this, last Sunday"
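      A minimal sketch of that 'bookmark' idea. n8n Function nodes are actually JavaScript and use the workflow's static data; this is just the pattern, written in Python with hypothetical names:

```python
# Hypothetical sketch of the "bookmark" pattern: persist how far we got
# on the last run, and only hand downstream the rows added since then.

def new_rows_since_bookmark(rows, state):
    """`rows` is the full spreadsheet/CSV contents as a list;
    `state` is a dict persisted between runs (like n8n's static data)."""
    last_seen = state.get("last_row_index", 0)
    fresh = rows[last_seen:]             # everything after the bookmark
    state["last_row_index"] = len(rows)  # advance the bookmark
    return fresh

state = {}
print(new_rows_since_bookmark(["a", "b"], state))       # first run: all rows
print(new_rows_since_bookmark(["a", "b", "c"], state))  # next run: just ["c"]
```

      The same idea works with a timestamp or row ID as the bookmark instead of an index, which is more robust if rows can be deleted.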

    1. Ensuring that the regional market is competitive and that there are incentives for companies to buy local cloud infrastructure is a role that only government actors can fulfill. Moreover, it is a responsibility that is clearly within their mandate. Not coincidentally, such an approach clearly aligns with the European Commission’s and some European Union member state governments’ laudable competition and antitrust strategies, echoing attempts to safeguard the European market and uphold strong values throughout the EU.

      This is literally the opposite of how it works right now, with some procurement specifying AWS

    2. Such a subsidy can be aimed effectively, not at the cloud provider itself, but at companies undertaking a digital transformation or startups, giving them the freedom to choose any local provider and receiving cloud credits from their regional government. These credits are then spent locally, lifting and strengthening the local IT and digital ecosystem.

      So, basically providing access to the same capital that the cloud giants have, but raised from taxation?

    1. It’s true! None of these tactics, on their own, will address complex, deep-rooted social problems. But each of them represents a potential pathway that we can ascend when other routes are blocked.

      Useful framing in the syllabus.

      We have some idea of the goals, and talking in terms of methods provides options to suit the context

  3. Sep 2021
    1. With Amazon the sole customer of the substation it will (via Oppidan) pay for the 26 month-long design and construction process, with the exception of the City-owned control building. It is expected to cost $5,388,260 across three payment milestones, one of which has already been paid.After it is built, property rights will transfer over to SVP, which will operate and maintain the substation.

      OK, so it's not so much a substation owned like a black box. But Amazon is the sole customer, and it likely bought the site so:

      a) it would stop others building a datacentre there
      b) it could then make use of the substation, providing extra distribution for the other DCs it wants to operate and use, so it can expand further

    1. This verticalization will have the great flaw of making the real consumption of these infrastructures invisible. Today we can still retrieve some data from water and energy providers but when Amazon builds its own substations, like in Santa Clara, or Google its own pumping stations then the black box will continue to grow.

      I had no idea Amazon is building its own substations.

    2. At the environmental level, the territorial approach makes it possible to get out of the mystique of relative efficiency values to align consumption in absolute value with a local stock and a precise environment.

      Absolute consumption as a percentage of the local resources would be a huge jump forward here

    3. However, the possible unsustainability of the new data center project was outweighed by an $800 million project with various financial benefits to the community, so the construction project was voted 6-1 in the city council.

      It's worth comparing this to other water reservations for context. Comparing it to agriculture in the same area might help, to see the choices people are facing

    4. It also raises the point that data centers could crowd out renewable energy capacity on the grid, slowing down the country's energy transition.

      I think the argument made here is that the load can exceed the generation coming from renewable sources, meaning this would end up leading to more dirty power coming online to meet the demand.

      The alternative might be to adjust demand, with the virtual capacity curves proposed in the Google paper, and supplement that with storage

    5. Energy used in a mine, in freight, in the supply and production chain is much less likely to be renewable.

      It's worth considering how a CBAM (carbon border adjustment mechanism) might affect this, as it's designed specifically to address this issue of high-carbon-intensity goods crossing country or trading-bloc borders, like the EU's

    6. The US giant advertises that its data center in Eemshaven in the Netherlands would be 100% powered by RE since its opening in 2016. However, on Google's electricity supply matrices we can clearly see that 69% of the electricity supply was provided by RE. The remaining 31% is offset by RECs or virtual PPAs. Google's statement in the preamble is therefore not factually correct.

      These might still be offset by RECs that are tied to a specific point in time, sometimes referred to as T-EACs (time-based energy attribute certificates).

    7. In this scientific literature, it is estimated that the manufacturing phase (construction of the building + manufacturing of the IT equipment) represents on average 15% of the energy and GHG footprint of a data center in a country with "medium" carbon electricity (approx. 150-200gCO2/kWh).. To get to this figure, it is assumed that the building is new and will last 20 years and that the IT equipment is replaced every 4 to 5 years. Based on GAFAM's Scopes 3, a recent publication by researchers from Facebook, Harvard and Arizona University estimated that the carbon impact of data centers related to IT equipment, construction and infrastructure was higher than imagined. There is therefore a growing interest in better understanding these "omissions".

      This is a good point. Refresh rates can be closer to 1-2 years in some hyperscalers. Good for use-phase carbon, bad for embodied carbon

    1. The Commission found that the arrangement, as currently written, could result in annual revenue shortfalls ranging in the millions of dollars, which other customers would have to cover due to the credits that could completely zero-out Facebook’s bill.“The Commission noted this is not logical— that a customer could reduce its bill by using more resources,” it said.

      As I understand this, structuring the deal to give a low cost for a long-term agreement would mean bills would have to be raised on other ratepayers, to make sure the company with the monopoly is able to make the pre-agreed rate of return it is allowed to make each year.

    1. Multiply that by the 80k Server rooms and we get a staggering £4,600,000,000 per annum, (4.6Billion) or 38.54TWh or 11.37 percent of the electricity generated in the UK, a long way from the 1 percent cited earlier [although that was one percent of energy, not electricity - Editor].The CCA info (2017) revealed that the total electricty consumed by sites taking part in the Agreement was 2.573TWh which was 0.79 percent of the country's total electricity generation.So, my figures and the CCA (2017) figures total 41.11TWh representing just over 12 percent of total generation.

      In most cases this is likely a steady load across the whole year.

      41.11 TWh seems incredibly high - the IEA puts global data centre electricity use at roughly 200 TWh per year - but assuming it is correct:

      We'd need to divide it by 365 days and then by 24 hours (8,760 hours in total) to get an idea of the likely continuous power draw, as the infra doesn't really get turned off.
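      The back-of-envelope conversion, spelled out:

```python
# Convert the claimed annual consumption into a continuous power draw,
# assuming the load really is flat around the clock.
annual_twh = 41.11
hours_per_year = 365 * 24                            # 8,760 hours
continuous_gw = annual_twh * 1_000 / hours_per_year  # TWh -> GWh, then per hour
print(f"{continuous_gw:.2f} GW continuous")          # roughly 4.7 GW
```

      That's a lot of always-on plant for one country, which is another reason to be sceptical of the 41.11 TWh figure.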

    2. After techUk’s Emma Fryer released the results of the second period of the UK data center sectors climate change agreement (CCA) 2nd Period findings in 2017, I conducted some desk-based research which looked at the issue from a UK PLC perspective and included all those enterprise data centers, server cupboards and machine rooms that are largely hidden.

      John mentioned to me that the CCA figures from 2017 might be a little out. It's worth sanity-checking that.

    3. “In the UK 76.5 percent of the electricity purchased by our commercial data center operators is 100 percent certified renewable”

      This is an annualised figure, so it doesn't match the actual time of use of datacentres.

      For that we'd need to have a rough idea of how well generation matches the load profile in the UK

    1. In building this system we simultaneously solved three high-level challenges: supporting exabyte-scale, isolating performance between tenants, and enabling tenant-specific optimizations. Exabyte-scale clusters are important for operational simplicity and resource sharing. Tectonic disaggregates the file system metadata into independently scalable layers, and hash-partitions each metadata layer into a scalable shared key-value store. Combined with a linearly scalable storage node layer, this disaggregated metadata allows the system to meet the storage needs of an entire data center.

      So it seems to add a layer of indirection: instead of everyone needing to read off the same bits of a disk, the data is stored in places indexed by the KV store, which allows reads and writes to be spread across a linearly scaling storage layer.

      Worth reading the paper to check if this guess is close to reality
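      The hash-partitioning part of that guess is easy to illustrate. This is a toy sketch of the general technique of spreading metadata keys over shards - not Tectonic's actual scheme:

```python
import hashlib

def shard_for(key: str, num_shards: int = 8) -> int:
    """Map a metadata key (e.g. a directory or inode id) to a shard.
    Toy illustration of hash-partitioning, not Tectonic's real layout."""
    digest = hashlib.sha256(key.encode()).hexdigest()
    return int(digest, 16) % num_shards

# The same key always lands on the same shard; different keys spread out,
# so no single shard becomes the hotspot for all metadata lookups.
print(shard_for("dir:/warehouse/table1"))
print(shard_for("dir:/warehouse/table2"))
```

      The appeal is that adding shards scales metadata throughput roughly linearly, at the cost of losing cheap range scans over keys.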

    1. The combination of Raspberry Pi and LimeSDR Mini brings with it another major advantage: cost. “The network we built in 2014 was using a software defined radio and a compute unit, but I think there, even for the cheap ones, we were looking at $1,500 for each base station. So, in four years we’ve come down probably eightfold in cost, and complexity and power and all the other stuff that goes with that, because stuff’s got physically smaller as well as cheaper.”

      Each base station for about 200 USD?

    1. Earlier this month, cleantech startup Clearloop broke ground on a 1‑megawatt solar project outside Jackson, Tennessee that stakes out a bold new definition of solar’s carbon-reduction value. It’s the first utility-scale solar project in the country to be partially financed by selling the carbon emissions it will displace over its lifetime. These transactions will not take the customary form of renewable energy credits that average out that value over time, but rather of carbon offsets that are directly related to the power grid the project is connected to.  In more specific terms, about $400,000, or roughly one-third of the project’s cost, was raised via the sale of offsets for nearly 60 million pounds of carbon emissions.

      Wow, this is so much more 'additional' than the often tokenistic measures I see with RECs

    1. The aim of the financial declaration clauses is to provide social pressure to make sure donations happen. This clause doesn't require anybody to make a donation to the DSF - there's no mandatory license fee for any use of the Django mark. However, if you want to use the Django name or logo, you do have to publicly declare that you're not giving anything back to the project. The hope is that this will provide enough social pressure to encourage some level of contribution back to the DSF.

      Social pressure tied to the trademark - useful example to refer to

    2. ... I'm not happy with the way an event/group handled my code of conduct complaint? The first line for reporting any code of conduct violation should always be the event or group organizers themselves. However, if you've done this, and you're not happy with the response you've received, contact the DSF by email: foundation@djangoproject.com and the DSF will investigate and respond. In the extreme case, this response may be to revoke the group/events license to use the Django name

      A potential escalation path for events and conferences when the denial and delay response talks come out

    3. You must also adopt a code of conduct (the Ada initiative draft is a good starting point, but you can choose another code if you wish), and agree to run your event in the spirit of the Django Community code of conduct.

      This is good example of trademark law being used to enforce norms around codes of conducts

    1. Originally, Google was building flying cell towers to beam down the Internet from the sky (over RF), but for balloon-to-balloon backhaul, the company was planning communications via laser beam. Space X just started doing something similar by equipping its Starlink satellites with space lasers for optical intra-satellite communication. One benefit of Sky- and space-based laser communication is that not much can interfere with a point-to-point optical beam. Ground-based lasers have more interference to consider, since they have to deal with nearly everything: rain, fog, birds, and once, according to Alphabet's blog post, "a curious monkey."

      A 'curious monkey' caused an internet outage by blocking the beam?

    1. For example, a 1:1 HD 1080p video meeting of 1 hour between two people would require 3.24GB of bandwidth, consuming 0.0486 kWh of electricity. The 2019 UK electricity emissions factor is 0.25358 kgCO2 per kWh, so the CO2 emissions for this call are 0.012 kgCO2. If this happened in the US between two people in New York, the emissions factor is similar to the UK at 0.28839 kgCO2 per kWh but if it was between two people in Chicago, that would be 0.56191 kgCO2 per kWh. Location matters.

      These figures are for 2019.

      The infra has got more efficient in the two years since, and the grid has also got greener, so the number is likely lower now in 2021.
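      Reproducing the article's arithmetic makes the "location matters" point concrete:

```python
# Recomputing the quoted figures for a 1-hour, 1:1 HD 1080p video call.
kwh = 0.0486                 # electricity for 3.24 GB, per the article
uk_factor = 0.25358          # kgCO2 per kWh, UK grid 2019
chicago_factor = 0.56191     # kgCO2 per kWh, Chicago

uk_kg = kwh * uk_factor
chicago_kg = kwh * chicago_factor
print(round(uk_kg, 3), round(chicago_kg, 3))  # ~0.012 vs ~0.027 kgCO2
```

      So the same call emits more than twice as much in Chicago as in the UK, purely because of the grid mix.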

    1. My estimate of 36gCO2 per hour is more than 2,100-times lower than Marks et al. (2020) who estimated that 35 hours of HD video emits 2.68tCO2, or 77kgCO2 per hour.

      This is a figure for Netflix, which is likely lower than a Zoom call.

      With a video on Netflix, the infrastructure is designed to stream a file that's already been encoded.

      This is different to something like Zoom, where the infrastructure is encoding the live video streams from the camera on the fly, and if need be re-encoding them to suit the device

    2. a result, the central IEA estimate for one hour of streaming video in 2019 is now 36gCO2, down from 82gCO2 in the original analysis published in February 2020.

      By comparison, an espresso coffee is around 280g CO2e. So even if we use the high figure from the Shift Project, it's still around three hours of video for the carbon footprint of a cup of coffee.

    1. It’s not actually going to be a standard, per se, because you can’t pass regulatory standards through reconciliation. Instead, it’s going to be a system of fines and payments that will incentivize utilities to increase their proportion of renewable energy to meet the targets. It’s called a clean electricity payment program (CEPP). A CEPP actually has some advantages over the traditional CES’s and renewable portfolio standard (RPSs) commonly seen in states. For one thing, it’s more progressive: the money to drive the transition comes from federal coffers (via taxes on corporations and the wealthy) rather than from electricity rates, which are regressive.

      If you pay for the transition from taxation like this, the money largely comes from richer members of society, so it's more progressive than tacking the charge onto every kilowatt hour used by consumers, which disproportionately affects lower income groups.

    1. With a drink containing approximately 18 g of green coffee (Starbucks Coffee Company, 2019), each kg of green coffee makes approximately 56 espresso beverages. Thus, the carbon footprint found in the LCA is on average 0.28 and 0.06 kg CO2e per espresso beverage for conventional and sustainable coffee, respectively (9.2 and 2.1 g CO2e ml–1). In an LCA of milk production, Hassard et al. (2014) estimated a carbon footprint of 2.26 g CO2e ml–1. Using these values, the carbon footprint of standard coffee beverages was estimated: with the conventional production of coffee beans, the carbon footprints for one serving of caffe latte, flat white, and cappuccino were estimated to be 0.55, 0.34, and 0.41 kg CO2e, respectively. When produced sustainably, these values were reduced to 0.33, 0.13, and 0.20 kg CO2e

      These are the figures cited in Phys.org, and what you might compare to the streaming figures to get an idea of whether losing 3 hours of Netflix matters more to you than skipping that next coffee.

    1. Renewable energy surcharge (21%) Finances the feed-in tariffs for renewable power and the market premium paid to larger producers - 6.41 ct/kWh.

      About 20% of every kilowatt hour in Germany goes towards the renewables surcharge. So the greater the percentage of your earnings you spend on electricity, the more you contribute to renewables, compared to someone who earns more and spends a smaller share of their income.
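      A quick cross-check of the quoted figures (a sketch; assumes the 6.41 ct/kWh surcharge really is 21% of the retail price, as the quote says):

```python
SURCHARGE_CT_PER_KWH = 6.41   # renewable energy surcharge, ct/kWh
SURCHARGE_SHARE = 0.21        # quoted as 21% of the electricity price

# Implied total retail price per kWh, ~30.5 ct, which is in the right
# ballpark for German household electricity prices.
implied_retail_ct = SURCHARGE_CT_PER_KWH / SURCHARGE_SHARE
```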

    1. In the 2014 Radio Equipment Directive, EU lawmakers called for a common charger to be developed and gave the Commission powers to pursue this via a delegated act. The Commission's approach of “encouraging” industry to develop common chargers fell short of the co-legislators’ objectives. However, some progress has been made, said the Commission in the plenary debate on 13 January 2020: in 2009, there were more than 30 charging solutions, while today there are three charger types. In its resolution on the European Green Deal, Parliament called for an ambitious new circular economy action plan aiming to reduce the total environmental and resource footprint of EU production and consumption, with resource efficiency, zero pollution and waste prevention as key priorities.

      This is why I say expecting end users to just shop greener is missing a key part of the picture. The ability to compel an industry to standardise on a smaller number of chargers has saved an immense amount of waste, as it decouples the thing you want (shiny phone) from the thing you need so it can be used (charger).

    1. This article derives criteria to identify accurate estimates over time and provides a new estimate of 0.06 kWh/GB for 2015. By retroactively applying our criteria to existing studies, we were able to determine that the electricity intensity of data transmission (core and fixed-line access networks) has decreased by half approximately every 2 years since 2000 (for developed countries), a rate of change comparable to that found in the efficiency of computing more generally.

      This is a figure from 2017, but the halving has been going on for 20 years. There are signs of it slowing down, but not by much.
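      If the halving trend held, you could project the intensity forward from the paper's 2015 figure. A rough sketch (the continuation of the trend is the assumption here):

```python
def intensity_kwh_per_gb(year, base_year=2015, base=0.06, halving_period=2):
    """Electricity intensity of data transmission, assuming it keeps
    halving every `halving_period` years from the 2015 baseline."""
    return base * 0.5 ** ((year - base_year) / halving_period)

# Projected 2021 figure: three halvings on from 2015, so 0.06 / 8 = 0.0075 kWh/GB.
projected_2021 = intensity_kwh_per_gb(2021)
```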

    1. As the graph above shows, renewables will need to do the heavy lifting, growing even faster than in recent years. By 2050, says the IEA, around 90 per cent of global electricity supply will need to be low carbon, about 70 per cent from solar and wind power, with the rest mostly from nuclear.

      this is the edited version of the data in the IEA 1.5 C net zero report in May

    1. The average cup of coffee contains about 18g of green coffee, so 1 kg of it can make 56 espressos. Just one espresso has an average carbon footprint of about 0.28 kg, but it could be as little as 0.06 kg if grown sustainably.

      This is the figure I use when comparing the quoted figures for video streaming
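      Putting this next to the streaming figures elsewhere in these notes: with the IEA's 36gCO2 per hour estimate (or their original 82g figure), one conventional espresso buys a fair amount of viewing. A quick sketch:

```python
ESPRESSO_G_CO2E = 280   # conventional espresso, ~0.28 kgCO2e

# Hours of streaming with the same footprint as one espresso.
hours_at_82g = ESPRESSO_G_CO2E / 82   # original Feb 2020 estimate: ~3.4 hours
hours_at_36g = ESPRESSO_G_CO2E / 36   # revised IEA estimate: ~7.8 hours
```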

    1. But I'm wondering if VC-backed firms should be excluded from projects that require long-term maintenance because their growth requirements (fiduciary duties to investors, aka "we need a hockey stick") means they can't commit to 10-20 year contracts?

      Actual serious point. We've had 20-odd years of seeing the incentives at work, and they're not always good.

    1. 2020 is the year in which the current Dutch subsidy scheme for renewable energy, the Renewable Energy Production Incentive Scheme (de stimuleringsregeling duurzame energieproductie (SDE+)), will change. From 2020 onwards, the SDE + will be broadened to achieve the target of a 49 percent reduction in CO2 emissions in the Netherlands by 2030 (or at least to keep this goal within reach). The broadened SDE+ focuses on the reduction of greenhouse gas emissions (CO2 and other greenhouse gases). This will change the focus from energy production to energy transition. The broadened subsidy scheme is therefore called the Renewable Energy Transition Incentive Scheme (SDE++).

      So, this is the expanded version that is focussed on a more holistic, systemic approach

  4. Jun 2021
    1. Air pollution caused by the burning of fossil fuels such as coal and oil was responsible for 8.7m deaths globally in 2018, a staggering one in five of all people who died that year, new research has found.

      This is not climate change or heat-related deaths - this is particulate matter (PM2.5) from burning the fuels themselves

  5. Feb 2021
    1. ClimateTech Jobs Fair: CAT members Terra.do, in addition to being one of the first orgs to donate to CAT, are organising a virtual jobs fair on March 5th. Register to have live 1:1 conversations with hiring managers for jobs in software, data science, product management, hardware and more at top climatetech companies.

      Terra.do's plug for their job fair.

    1. While it is critical to enhance the sustainability of digitalisation, the ICT sector’s estimated potential in the reduction of GHGs is ten times higher than its own footprint

      Source?


  6. Jan 2021
    1. Currently, there is a significant lack of transparency of environmental cost, which should be urgently resolved given the vast scale of resource usage. Therefore, NGIatlantic.eu invites EU – US applicants that can provide and experiment with transparency mechanisms on the environmental cost of the Internet. Identification and tagging of most resource consuming elements are also very important and urgent. On both sides of the Atlantic, there has already been some early research and innovation projects and initiatives focussing on alternatives to improving energy efficiency to ensure the greening and sustainability of the Internet and of the economy relying on it. This topic welcomes the results from these EU activities to team up with US teams (or vice versa, with US teams twinning with EU teams) to carry out experiments in this vitally important NGI topic.

      Interesting. I couldn't see many in the link though.

    1. That will allow the utility to match in real-time specific units of renewable energy generation with Microsoft’s usage – and dispatch energy from storage if there’s a shortfall – to ensure Microsoft is continually supplied with renewable energy, Janous said.

      They ARE pairing it with batteries

    1. Typically, a data center is seen as an inflexible load, or a “single block” of power, she said. In other words, if the facility’s total load is 10MW, the conventional approach is to ensure there’s 10MW of backup power. Google has learned to not treat total load as an inflexible block and match backup capacity more tightly with the capacity required by applications that run in the facility and the duration for which it’s required.

      This is a good summary of carbon-aware compute.

    2. Also this year, Microsoft announced a successful test for powering a data center rack with a hydrogen-fueled automotive fuel cell, looking at the technology as one of the potential replacements for diesel generators.

      Bosch sell these now, or are making them.

      https://www.bosch.com/research/know-how/success-stories/high-temperature-fuel-cell-systems/

      There's also a colossal amount of EU money in creating a hydrogen economy, so it feels like the conditions are good for switching.

    3. Google estimates that the total generation capacity of all diesel-fueled data center backup generators deployed worldwide is more than 20 gigawatts, which could spell vast opportunities for renewable energy storage.

      20 gigawatts needed to replace all fossil generators worldwide.

      Not sure how long diesel generators last - 12 hrs? 24hrs?

    1. But proponents of immersion cooling technologies emphasize their efficiency advantages. These solutions don’t require any fans. Server fans are usually removed altogether. “You can probably get at least a 15-percent energy reduction in a lot of environments by going to liquid cooling,” Brown said.

      I didn't know that liquid cooling was more energy efficient than air cooling. I also didn't really think about it much either, but it makes sense. Liquid is a better conductor of heat than air in cooking, so…

    2. As Uptime’s Lawrence pointed out, generators are “a problem.” They are expensive to deploy and maintain, they pollute the atmosphere, and make a lot of noise. So far, however, they’ve been an indispensable part of any data center that’s meant to keep running around the clock.

      Wow, I didn't know the direction of travel away from diesel was so pronounced in DCs. Makes sense tho.

    1. Perhaps the most critical feature of a decentralized UPS architecture is it allows data centers to meet increased demand by deploying equipment-specific resources that can scale according to needs. New cabinets can expand capacity as necessary, while additional rectifiers and battery modules can increase power for servers added to open racks. By relying on DC power components that connect directly to the AC utility feed, a decentralized power architecture allows facilities to operationalize stranded white space and maximize infrastructure without placing any additional strain on their existing UPS system.

      Oh wow, so they decentralise INSIDE the datacentre. Would this mean it would be easier to move loads (as in power usage) around in the DC, and power down entire sections more easily?

  7. Nov 2020


    1. As a leader in the European data centre industry, we strive to set an example of environmental responsibility. In addition to innovations in engineering and diligent operations for maximising energy efficiency, Interxion supports and consumes energy from sustainable and low carbon sources to the greatest practical extent in our markets of operation. A large proportion of our power comes from sustainable sources, including water, solar and wind.

      A large proportion is a bit vague, and not the same as 100%.

      If they're part of the same group as Digital Realty, they should have these numbers, as DR's report includes total power used vs total power from renewable sources, and there are clear ways to confirm this if true.

    1. We’re now 100% powered by renewable and sustainable energy which is great in further minimizing our impact on the planet. Plausible Analytics script weighs less than 1 KB which is more than 45 times smaller than the recommended Google Analytics Global Site Tag implementation.

      After speaking to the folks at Plausible they pointed me to this page on the digital ocean community forums:

      https://www.digitalocean.com/community/questions/what-kind-of-electricity-do-you-run-on

      And this one here:

      https://www.interxion.com/why-interxion/sustainability

      The TLDR version is that the servers they are using are run by Digital Ocean, who lease from Interxion, who source the power for the datacentre from renewables.

      Interxion themselves are owned by Digital Realty, who do release figures, but not at a granularity to confirm.

      Once there is info from Interxion, it's possible to confirm this.

    1. Hours of session time in each region is multiplied by estimated power consumption of devices and device breakdown (% laptop, % tablet, etc.) used to access Mozilla products. This calculation produces total annual kWh by region. Regional electricity emissions factors are applied to calculate total regional emissions and summed to calculate total emissions by product

      Yup, the more I look at this the more it doesn't seem to account for transfer.

      I can see why you might not account for this, and leave it outside the system boundary, but I can't see the logic for including hardware while not including network usage, when it's likely to be material.

    2. Technology energy consumption used to access Mozilla products, broken down by type and power consumption: desktop computer, laptop computer, tablet, mobile, modem/router.

      Looking at this, it seems not to account for emissions from network transfer, which would be a key use for a browser - just end use.

    3. Server allocation per $ spend: 0.001 server / $ spend

      So, put another way: for every ~$1,000 of spend, you assume you're using one whole physical server's worth of compute.

    4. Sum of space-specific refrigerant leak in kg by refrigerant type, divided by 1,000, multiplied by refrigerant type-specific global warming potential (GWP) to calculate organization total GHG emissions in mtCO2e.

      This gives an idea of why fixing refrigeration ranked as such a high intervention in Drawdown. Many of these refrigerant gases are utter carbon bastards, and really bad news: one kilo released has a similar impact to 1.3 tonnes of 'regular' CO2!
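      The quoted formula is simple enough to write down. A minimal sketch (the GWP of 1,300 is an illustrative round figure for a common HFC refrigerant, not a value from the report):

```python
def refrigerant_emissions_tco2e(leak_kg, gwp):
    """Convert a refrigerant leak in kg to tonnes CO2e:
    kg / 1,000 gives tonnes of refrigerant, multiplied by the
    gas-specific global warming potential (GWP)."""
    return leak_kg / 1000 * gwp

# One kilo of a refrigerant with GWP 1,300 has the impact of 1.3 tCO2e.
one_kilo = refrigerant_emissions_tco2e(1, 1300)
```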

    5. Mozilla uses “business services and operations” to refer to the organization’s calculated Scope 1, 2 and 3 emissions with the exception of Scope 3 Use of Products.

      14k tonnes would be the footprint of Mozilla if they didn't account for end user products.

    6. Use of Products = 98% of Scope 3 total

      Wow, this is a huge percentage. It's pretty rare for orgs to report end user emissions as well, but it highlights the importance of setting sensible defaults.

    7. Mozilla’s 2019 GHG inventory is comprised of emissions from scope 1, scope 2 and relevant scope 3 categories.

      It's surprising how many companies don't do all three scopes. It's good to see the effort gone in.


  8. Oct 2020
    1. We have: defined procurement principles and standards. These are (in summary):

       - 100% renewable energy and/or carbon neutral suppliers
       - 0% to landfill and an annual increase in reuse and material recycling
       - increased transparency across HMG, suppliers and the supply chain
       - 100% traceability of ICT at end of life
       - a yearly increase in procured ICT and services that is remanufactured/refurbished

      Pretty explicit

    1. I'm really curious whether or how creating more effective car battery recycling technologies could support or underwrite consumer electronics recycling. For years, the line on e-waste has been that the economic model for electronics recycling just doesn't work. As devices get smaller, extraction gets more time-consuming and with increased component miniaturization, recycling companies are getting less and less value from what metal they can recover (although notably, Redwood Materials is refining their battery recycling process through recycling consumer electronics). Recycling car batteries at scale fills a massive need and companies will definitely pay for those materials. If Redwood can get it right, they could potentially make a less-profitable recycling niche more viable with the resources afforded by its primary more profitable niche.

      Basically, ways to reclaim existing materials already in the technosphere, rather than needing to mine more stuff

  9. Sep 2020
    1. In addition, the recent need to accelerate deep-learning and artificial intelligence applications has led to the emergence of specialized accelerator hardware, including graphics processing units (GPUs), tensor processing units (TPUs) and field-programmable gate arrays (FPGAs). Owing to its in-memory data model, NumPy is currently unable to directly utilize such storage and specialized hardware. However, both distributed data and also the parallel execution of GPUs, TPUs and FPGAs map well to the paradigm of array programming: therefore leading to a gap between available modern hardware architectures and the tools necessary to leverage their computational power.

      Ah, so while it supports SIMD, it doesn't support this stuff yet.

    2. NumPy operates on in-memory arrays using the central processing unit (CPU). To utilize modern, specialized storage and hardware, there has been a recent proliferation of Python array packages

      So these can drop down to take advantage of SIMD and all that?

  10. Aug 2020
    1. his dream of it being as easy to “insert facts, data, and models in political discussion as it is to insert emoji” 😉 speaks to a sort of consumerist, on-demand thirst for snippets, rather than a deep understanding of complexity. It’s app-informed, drag-and-drop data for instant government.
  11. Jun 2020


    1. The set-up of a cloud services marketplace for EU users from the private and public sector will be facilitated by the Commission by Q4 2022. The marketplace will put potential users (in particular the public sector and SMEs) in the position to select cloud processing, software and platform service offerings that comply with a number of requirements in areas like data protection, security, data portability, energy efficiency and market practice.

      Best source of compute, but across the entire EU?

  12. May 2020
    1. In France, as in other Western countries, for various reasons too long to mention in this study, the collection level of waste electrical and electronic equipment is around 45%.

      This is higher than I thought - almost half of all electronics are collected in France?

    2. Had they been implemented as of 2010, these 4 measures would have reduced the global digital footprint over the period observed (2010 to 2025) by between 27% and 52%

      How would you back this claim up, or disprove it? It feels like catnip for journalists, but I don't see how you could interrogate it.

    3. Reducing the number of flat screens by replacing them with other display devices such as augmented / virtual reality glasses, LED video projectors, etc.

      Presumably this would be down to the energy intensive nature of making a flatscreen?

    4. Their contribution to the impact of the digital world thus goes from less than 1% (all environmental indicators combined) in 2010 to between 18% and 23% in 2025. It is huge!

      These numbers seem comically high

    5. In 2025, user equipment will concentrate from 56% to 69% of the impact. For example, 62% of digital GHG emissions will be user-related, 35% of which comes from equipment manufacturing.

      What kind energy mix does this assume in 2025?

    6. The hierarchy of impact sources is as follows, in descending order of importance: 1. Manufacturing of user equipment; 2. Power consumption of user equipment;

      So these guys say it's the end use that's the big problem, not the infrastructure or network usage per se.

    7. In 2019, the mass of this digital world amounts to 223 million tonnes, the equivalent of 179 million cars of 1.3 tonnes (5 times the number of cars in France).

      So this might be a bit like the idea of biomass, but for technology?

    1. We've likely cleared half the biomass of plants on earth, and even after that, plants still make up more than 80% of all the biomass of everything alive today.

      The livestock we farm is likely around 10 times the biomass of all wild mammals and birds

    1. This document gives a good background on where you might choose to use LCA, and where it's not such a useful tool.

      It's from 2012, but what it's saying is mostly in line with my understanding of the subject.

      Things have moved on since it was written and the open source projects seem not to be all that active now.

    2. Analysis of a web-search of published LCA study results for ICT devices showing percentage use-stage carbon. Source: Darrel Stickler, Cisco

      This diagram is v handy - at a glance, it gives an idea of where the main levers for reduction might be, depending on the kind of product

    3. This framework tends to follow a similar pattern which includes most or all of the following steps: set goals and define scope, inventory analysis, impact assessment, interpretation, reporting and critical review.

      This is a really helpful diagram for explaining what the alphabet soup of standards means, and how you might apply them.

    4. Even the OECD definition of ICT includes a whole range of consumer electronic (CE) products that many would not expect to see classed as ICT.

      Where is the official definition in the OECD?

    5. And with carbon, as with calories, it is the pies – or rather their carbon equivalents – that are important. LCA is a poor tool for differentiating between strawberries and raspberries, but it is a wonderful tool for identifying where the pies are, and who is eating them

      This is totally worth using in future


    1. 1,211,22

      This is the final figure for scope 1, 2, and 3, after accounting for market based figures for CO2 from energy.

      They've split out a huge chunk of emissions in scope 3 as "other", but it's not immediately obvious to me what this includes, and they've chosen not to account for it when it comes to purchasing the offsets anyway.

    1. The action was the digital equivalent of queueing up at McDonalds and ordering the non-existent vegan, zero-waste Happy Meal again and again. Rebels targeted a different polluter each day, including fossil fuel companies Shell and BP, shipping company Maersk, and the Danish Finance Ministry for its recent bailout of Scandinavian Airlines.

      This sounds a lot like DDoSing websites to me.

    1. Technological improvements also play a role in driving down battery costs. Frith points out that the term lithium-ion battery is actually an umbrella term for a number of different battery chemistries.

      Did not know this.

    1. The Oil Climate Index from the Carnegie Institution gives full lifecycle emissions for a selection of global crude oils. That analysis does not specifically profile Permian crude oils, but does find that the median U.S. crude oil will lead to 0.51 metric tons of CO2-eq over its lifecycle.

      Are these figures the updated ones where life cycle figures for fossil fuels turn out to be higher than previously thought?

    2. Public commitment to no longer offer machine learning or high performance computing capabilities for the oil and gas sector for the purpose of new exploration or increased production, and to not renew existing contracts. End membership with the Open Subsurface Data Universe Forum.

      If you were a company doing this, what would the wording look like?

    3. In 2018, Google attracted former President and General Manager of BP, Darryl Willis, as VP of Oil, Gas, and Energy at Google Cloud, where he was tasked with developing new products and solutions and building trusted relationships with key leaders and companies in the oil and gas sector

      VP of Oil, Gas and Energy

    4. To realize the climate commitments they have set, Google, Microsoft, and Amazon must continue to reduce carbon emissions throughout their own operations and publicly distance themselves from customers that are making the climate crisis worse.

      It's crazy that this would be seen as controversial, and yet…

    1. With the backing of Green Alliance and some philanthropic funders, I set up a training programme. We offered parliamentary candidates and new MPs the chance to learn about the science, policy and politics of climate in a series of tailor-made workshops. We worked with small groups of around 10 politicians, all from the same party, to allow them to question and debate freely.

      I wonder how much this cost, to design these kinds of workshops?

    1. Strengthening: The team understands its role in the larger organizational system and actively works to make that system more successful.

      Ah, so strengthening refers to an outward use of the term, not the team itself getting stronger per se.

    2. Although you may be tempted to hire new employees to fill the gaps, it’s usually more effective to include employees who already understand your business’s unique priorities and constraints.

      So, basically they propose embedding a domain expert in an existing team, as it's easier to propose than hiring a new role

    3. Each fluency zone brings new benefits, so it may seem that the model should be treated as a maturity model, in which the goal is to reach maximum maturity. That would be a mistake. Unlike maturity models, where more mature is always better, the fluency model describes a collection of choices. Each zone represents a fully-mature choice. Each one brings value.

      Using fluency instead of maturity - less of a value judgement, and more of a deliberate decision. no one wants to admit to being immature, but admitting to not being fluent is easier

    4. Although teams develop proficiencies in any order, even from multiple zones simultaneously, we’ve observed that teams tend to gain zone fluency in a predictable order.

      So, there's a fairly clear order of where to start.

    5. Focusing teams produce business value. Delivering teams deliver on the market cadence. Optimizing teams lead their market. Strengthening teams make their organizations stronger

      Focussing, Delivering, Optimising, Strengthening.

    1. Once done with injecting my performance marks inside my HTML, I switched to the “Performance” tab, made sure I selected a “Fast 3G” network and “4x slowdown” for the CPU

      It's worth checking the profile on sitespeed.io to see how this compares

    2. Since I only wanted to see the CSS coverage, I used the filter “.css” and what I could see was that 92% of the CSS I was loading was not used. (Unused bytes will change in real-time when you start interacting with the page):

      Is this exposed in sitespeed?

    3. So, I decided to use some custom metrics using the Performance API to get a rough idea of what was time-consuming on the page I was auditing.

      So, using the performance API to translate it into something meaningful for clients

  13. Apr 2020
    1. For example, the GUI tool for PostgreSQL administration, PGAdmin 3, is used by many people.  (I’m an old-school Unix guy, and thus prefer the textual “psql” client.)  I’ve discovered over the years that while PGAdmin might be a useful and friendly way to manage your databases, it also automatically uses double quotes when creating tables.  This means that if you create a table with PGAdmin, you might find yourself struggling to find or query it afterwards.

      Oh my god. I wish I knew this ten years ago.

    1. A very large data centre may consume 30GWh of power in a year, costing its operator around £3,000,000 for electricity alone. A handful of sites in the UK consume even more than this although the majority of sites consume far less. The total power demand of the UK data centre sector is between 2-3TWh per year. Energy is usually the largest single element of operating costs for data centres, varying from 25-60%.

      This is about one percent of UK electricity usage. And this seems to discount smaller datacentres, which make up a much larger share of power use, if the Datacenter Dynamics piece from John Booth is anything to go by.

      https://www.datacenterdynamics.com/en/opinions/data-centers-reaching-net-zero-in-the-uk/
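      A rough back-of-envelope for the "about one percent" claim (the ~300TWh figure for total annual UK electricity demand is my own approximation, not from the report):

```python
DC_TWH = 2.5    # midpoint of the quoted 2-3 TWh/year for UK data centres
UK_TWH = 300    # approximate total UK electricity demand per year (assumption)

# ~0.8%, i.e. "about one percent" of UK electricity usage.
share = DC_TWH / UK_TWH
```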

    1. In this case, establishing a direct connection between the peers seems to be impossible, and the only remaining option is to use a server with a public IP address to proxy the communication between the peers, e.g. using the TURN protocol [18]

      This is just like our experiences with video again then. Back to TURN.

    2. For ephemeral updates PushPin uses an additional messaging channel, adjacent to the CRDT, which ties arbitrary messages to a device and user context. The current implementation is rudimentary: ephemeral data is not associated with a particular CRDT state and is distributed only over direct P2P connections. Nevertheless, it enables shared contextual awareness in the user experience of PushPin, providing a feeling of presence when other users are online or collaborating

      So THAT's how they do the presence like mouse pointers and the rest.

      If there's already a channel here, presumably you could do video too, if there was a central server

    3. Each URL also includes a contentType parameter, which indicates how that document should be rendered in the user interface. This parameter is part of the URL, not the document content, because the same document content may be rendered differently in different contexts. For example, PushPin could be extended to support flashcards for language learning. In one context, the document containing the database of flashcards could be rendered as a list of entries, while in another context it might be rendered as a quiz interface, presenting one side of one flashcard at a time.

      I had no idea that there was a separate 'viewer' concept. Neat

    1. People make light of the idea that digital should be the most basic of Maslow’s hierarchy of needs — over food, water, shelter, and warmth — but there is evidence that people do, to an extent, prioritise connectivity over food and comfort. Some refugees, for instance, are known to have asked for Wi-Fi or charging services ahead of food or water on arrival in a new country.

      🤯

    1. Globally, Amazon has 86 solar and wind projects that have the capacity to generate over 2,300 MW and deliver more than 6.3 million MWh of energy annually—enough to power more than 580,000 U.S. homes.

      The use of "has the capacity to" here is misleading.

      Capacity factors for renewables range between 15% and 40%, so this makes it sound like Amazon has paid for and used 6.3 million MWh of power when it's likely to be a fraction of this.

      By comparison, Microsoft report on how much power they did use each year, and how much of that was renewable. Last year they reported using around 7 million MWh of power, and buying almost this figure in renewable energy.
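      One way to sanity-check capacity claims like this is to compute the implied capacity factor from the quoted numbers (a sketch, using only the figures in the press release):

```python
CAPACITY_MW = 2300      # quoted total generation capacity
CLAIMED_MWH = 6.3e6     # quoted annual delivery, 6.3 million MWh
HOURS_PER_YEAR = 8760

# Output if every project ran flat out all year: ~20.1 million MWh.
max_mwh = CAPACITY_MW * HOURS_PER_YEAR

# Implied capacity factor of the quoted figures, ~31%.
implied_cf = CLAIMED_MWH / max_mwh
```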

  14. www.gitops.tech www.gitops.tech
    1. Additionally the image registry can be monitored to find new versions of images to deploy.

      Ahh, so this is how you might make it possible for a developer to just push changes, and not have to have access to the environment repository.

    1. For now, suffice it to say that Tailscale uses several very advanced techniques, based on the Internet STUN and ICE standards, to make these connections work even though you wouldn’t think it should be possible. This avoids the need for firewall configurations or any public-facing open ports, and thus greatly reduces the potential for human error.

      I wonder how this relates to VOIP and so on?

    2. Here’s what happens: Each node generates a random public/private keypair for itself, and associates the public key with its identity (see login, below). The node contacts the coordination server and leaves its public key and a note about where that node can currently be found, and what domain it’s in. The node downloads a list of public keys and addresses in its domain, which have been left on the coordination server by other nodes. The node configures its WireGuard instance with the appropriate set of public keys.

      So it's a little bit like a private DNS server?
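      A toy sketch of that exchange, with an in-memory dict standing in for the coordination server (real Tailscale uses WireGuard keypairs and NAT traversal, none of which is modelled here):

```python
# Each node publishes (public_key, endpoint) under its domain, then
# pulls back its peers' records -- much like a private address book.
import secrets

COORDINATION_SERVER = {}  # domain -> list of node records

def register(domain: str, endpoint: str) -> dict:
    # Stand-in for generating a real WireGuard keypair.
    node = {"public_key": secrets.token_hex(32), "endpoint": endpoint}
    COORDINATION_SERVER.setdefault(domain, []).append(node)
    return node

def peers(domain: str, me: dict) -> list:
    # Every other node's public key and address in our domain; these
    # would then be fed into the local WireGuard configuration.
    return [n for n in COORDINATION_SERVER[domain] if n is not me]

a = register("example.com", "192.0.2.1:41641")
b = register("example.com", "198.51.100.7:41641")
print([p["endpoint"] for p in peers("example.com", a)])  # ['198.51.100.7:41641']
```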

    3. Sadly, developers have stopped building peer-to-peer apps because the modern Internet’s architecture has evolved, almost by accident, entirely into this kind of hub-and-spoke design, usually with the major cloud providers in the center charging rent.

      This is such a quote.

    1. A breadcrumb in this case is a single pixel that you can place in a precise location on a webpage. Placing a breadcrumb could be as simple as Option + click. While navigating the web, you could leave breadcrumbs on different pages you find interesting over the course of a browsing session. When you're done, that sequential "trail of breadcrumbs" would be saved. You could then jump back into the trail and navigate "forward" and "backward" through the things you found interesting in that browsing session. Or share the trail with a friend, and they could step through your spatial path of navigating the web.

      This isn't a million miles away from how hypothesis allows you to annotate specific sections, although much more lightweight.

      It's plausible to show a set of thumbnails of pages with the highlights ... highlighted for others to see.
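      The trail mechanics described in the quote can be sketched as a small data structure; every name here is hypothetical:

```python
# A "trail" is an ordered list of breadcrumbs (page URL plus a pixel
# position), with a cursor for stepping forward and backward.
from dataclasses import dataclass, field

@dataclass
class Breadcrumb:
    url: str
    x: int  # pixel coordinates on the page
    y: int

@dataclass
class Trail:
    crumbs: list = field(default_factory=list)
    cursor: int = -1

    def drop(self, crumb: Breadcrumb) -> None:  # e.g. Option + click
        self.crumbs.append(crumb)
        self.cursor = len(self.crumbs) - 1

    def back(self) -> Breadcrumb:
        self.cursor = max(self.cursor - 1, 0)
        return self.crumbs[self.cursor]

    def forward(self) -> Breadcrumb:
        self.cursor = min(self.cursor + 1, len(self.crumbs) - 1)
        return self.crumbs[self.cursor]

trail = Trail()
trail.drop(Breadcrumb("https://example.com/a", 120, 300))
trail.drop(Breadcrumb("https://example.com/b", 40, 900))
print(trail.back().url)  # https://example.com/a
```

A shared trail would just be this list serialised and replayed by someone else's browser.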

    2. When you run a standup at a technology company, you typically go around a circle and each person gives their daily update. But in Zoom, there is no circle. You get confused about who is next. Two people start speaking at the same time. It's awkward and confusing. Eventually, you realize that one person, probably the manager, just has to dictate who goes next, at the risk of seeming bossy.

      Just having a line from left to right would be an improvement, or some consistent, implied order

    1. Since emission reduction projects registered under crediting programmes to date have been mostly developed in the context of cost-saving, rather than ambition-raising mechanisms, we understand that there are very few, if any, examples of existing credited projects that represent those high-hanging fruits, and which could be considered truly additional in the context of the Paris Agreement. Given the difficulty in objectively determining additionality in line with this definition, we consider that only a niche and ever reducing number of activities could count for this, and that this does therefore not represent a viable option for rapidly increasing demand volume of the market.

      This is a really important point, that the current "offsets" framing undermines.

    2. A climate responsibility approach needs to first and foremost incentivise and facilitate the reduction of one’s own emissions.

      I can't help thinking they should have led with this

    3. Emissions from project-specific activities, such as project-related travel, are attributed as cost items to their respective project cost lines.

      Project level carbon budgeting?

    4. We recognise that some of the activities with the highest transformation potential – and therefore with high suitability for supporting the objectives of the Paris Agreement – may be at early stages of development and/or may carry a risk of not delivering attributable emission reductions.

      This addresses some of the issues around needing to get a definite amount of emissions drawn down, when the science makes this very very hard to measure

    1. The High-Level Commission on Carbon Prices surveyed the available scientific literature, concluding that the explicit carbon-price level consistent with the Paris Agreement temperature objectives is at least US$40–80/tCO2 by 2020, provided that a supportive policy environment is in place (High-Level Commission on Carbon Prices, 2017). Informed by this report and allowing for its uncertainties, NewClimate Institute has imposed a price level of EUR 100/tCO2e for the 2014-2019 period. This is also in-line with the central estimate of climate change avoidance costs over the period to 2030 used in the European Commission’s 2019 Handbook on the External Costs of Transport (European Commission, 2019).

      Holy biscuits. They're not fucking about. Microsoft just increased their price for carbon to 14 USD per tonne, by comparison.

    2. This methodology for the estimation of GHG emissions includes the estimated equivalent climate impact of non-carbon climate forcers from aviation, such as condensation trails, ice clouds and ozone generated by nitrogen oxides and results in emission estimates approximately three times greater than if calculating only direct CO2 emissions (Atmosfair, 2016).

      Wow, they use the updated science
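      In practice the methodology amounts to multiplying a flight's direct CO2 by roughly three; the flight figure below is invented purely for illustration:

```python
# Non-CO2 forcers (contrails, ice clouds, ozone from NOx) scale direct
# CO2 by ~3x under the Atmosfair-style accounting quoted above.
direct_co2_t = 1.2         # t CO2, e.g. an invented long-haul return flight
forcing_multiplier = 3.0   # approximate factor for non-CO2 climate forcers
climate_impact_t = direct_co2_t * forcing_multiplier
print(f"{climate_impact_t:.1f} t CO2e")  # 3.6 t CO2e
```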

    1. For the moment, the main take-away is that there is a good argument that registration fees should not be set at $0, even this year. Rather, organizers should look at their existing budgets, and rework them by eliminating the costs associated with the physical event. Virtual conferences that do choose to set their prices low should be careful not to encourage an expectation that other virtual conferences (or future instances of this one) will always be free or cheap. (For example, one of the suggestions heard by the ASPLOS organizers was “keep it free;” obviously, this may not be financially sustainable.)

      I'm really glad this is mentioned, as it provides a chance to talk about actually paying presenters for their time, increasing the likelihood of new voices speaking at conferences.

  15. Mar 2020
    1. He mentioned a few examples where the AI fallacy is already playing out. One is the idea from the national statistician that we might not need to keep doing a census, because there will be lots of data from other sources. Neil pointed out that we actually need more classical statistics than ever, to verify all this machine learning and data. He calls this the “Big Data Paradox” - that as we measure more about society, we understand less. We need to be able to sanity check our large complex systems - the census is still valuable.

      The census as a calibration tool

    2. In the section on technical debt, the mythical man month, etc, I was amused to note that Neil called out Amazon's fabled "two pizza team" as American cultural imperialism. The problem arising from the separation of concerns and specialisation of teams is that no one is concerned with the whole system.

      This is such a good quote and helps explain so much.

    1. GENEVA (Reuters) - European countries need to invest to prepare their transport infrastructure for the impacts of climate change or face hundreds of millions of dollars in repair costs, a U.N. regional commission said in a study it says is the first of its kind.

      This is for transport. What about networks?

  16. www.fairphone.com
    1. GWP [kg CO2e]: total 43.85 (100.0%), broken down across the life-cycle phases as 35.98 (82.1%), -1.11 (-2.5%), 5.98 (13.6%) and 3.00 (6.8%)

      These figures here, for a smart phone from 2016 put the use phase at around 6kg of CO2 over a 3 year life cycle.

      That's fairly close to the Shift Project figures in their Lean ICT report.

    1. The quantification of this unit impact is done in kWh/byte. Three contributions are considered: the electricity consumption associated with using the terminal on which the action is performed; the electricity consumption generated by the activity of the data centers involved in transferring the data; and the electricity consumption generated by the activity of the other network infrastructures during the transfer of the data.

      Awright. So, reading this makes me think all the numbers from the 1byte model that are being cited appear to be concerned with the use phase, not the actual production phase.
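      The three contributions sum into a single kWh/byte figure; here is a sketch with placeholder coefficients (these are not the 1byte model's actual values):

```python
# Use-phase electricity for a data transfer: per-byte intensities for
# terminal, data centre and network are simply summed.
KWH_PER_BYTE = {
    "terminal": 1.0e-10,    # placeholder
    "datacentre": 7.0e-11,  # placeholder
    "network": 4.0e-10,     # placeholder
}

def use_phase_kwh(bytes_transferred: float) -> float:
    return bytes_transferred * sum(KWH_PER_BYTE.values())

gb = 1e9
print(f"{use_phase_kwh(gb):.3f} kWh per GB")  # 0.570 kWh per GB
```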

  17. Feb 2020
    1. Data centres and telecommunications will need to become more energy efficient, reuse waste energy, and use more renewable energy sources. They can and should become climate neutral by 2030

      For maddie

    2. Destination Earth, initiative to develop a high precision digital model of Earth (a “Digital Twin of the Earth”) that would improve Europe’s environmental prediction and crisis management capabilities (Timing: from 2021)

      Good heavens, a digital twin of Earth

    3. A circular electronics initiative, mobilising existing and new instruments in line with the policy framework for sustainable products of the forthcoming circular economy action plan

      I wonder if the restart project gang have seen this

    1. Climate change poses an unprecedented threat to humanity in the 21st century. In the period up to 2030, an estimated $3.5 trillion is required for developing countries to implement the Paris climate pledges to prevent potentially catastrophic and irreversible effects of climate change.

      Ah, that's where it comes from

    1. Methodology and sources. The analysis of the carbon intensity of streaming video presented in this piece is based on a range of sources and assumptions, calculated for 2019 or the latest year possible.
       - Bitrate: global weighted average calculated based on subscriptions by country and average country-level data streaming rates from Netflix in 2019; resolution-specific bitrates from Netflix.
       - Data centres: low estimate based on Netflix reported direct and indirect electricity consumption in 2019, viewing statistics and global weighted average bitrate (above); high estimate based on 2019 cloud data centre IP traffic from Cisco and energy use estimates for cloud and hyperscale from IEA.
       - Data transmission networks: calculations based on Aslan et al. (2017), Schien & Priest (2014), Schien et al. (2015), and Andrae & Edler (2015), and weighted based on Netflix viewing data by devices.
       - Devices: smartphones and tablets: calculations based on Urban et al. (2014) and Urban et al. (2019), iPhone 11 specifications (power consumption and battery capacity), and iPad 10.2 specifications; laptops: Urban et al. (2019); televisions: Urban et al. (2019) and Park et al. (2016), and weighted based on Netflix viewing data by devices.
       - Carbon intensity of electricity: based on IEA country-level and global data, and 2030 scenario projections.

      Let's roll these into the new model
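      Those ingredients can be rolled into a toy per-hour estimate; every coefficient below is an assumed placeholder for illustration, not the article's actual values:

```python
# Energy per viewing hour = bitrate-driven transmission and data centre
# energy plus device power, multiplied by grid carbon intensity.
def streaming_gco2_per_hour(
    bitrate_mbps: float = 5.0,         # placeholder global average bitrate
    network_kwh_per_gb: float = 0.1,   # placeholder
    datacentre_kwh_per_gb: float = 0.01,  # placeholder
    device_watts: float = 10.0,        # placeholder, e.g. a tablet
    grid_gco2_per_kwh: float = 475.0,  # IEA world average cited elsewhere
) -> float:
    gb_per_hour = bitrate_mbps * 3600 / 8 / 1000
    kwh = gb_per_hour * (network_kwh_per_gb + datacentre_kwh_per_gb)
    kwh += device_watts / 1000  # one hour of device power
    return kwh * grid_gco2_per_kwh

print(f"{streaming_gco2_per_hour():.0f} gCO2 per viewing hour")  # 122 gCO2 per viewing hour
```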

    1. Aristotle’s view of the process: “We are what we repeatedly do. Excellence, then, is not an act but a habit.”

      This is a really handy quote. totally borrowing it.

    2. Buying carbon offsets might still have greater impact in the short run, but you can’t see them, so their purchase is less likely to be contagious.

      I've never thought of them in terms of 'contagion' like this. interesting

    1. Today the average carbon intensity of electricity generated is 475 gCO2/kWh, a 10% improvement on the intensity from 2010.

      There is more recent data, but it's not for the whole world. Should come out around March 2020.

    1. It is relevant to estimate how much data is generated by - and associated electric power used - normal behavior like video streaming several hours every day.

      If we assume video streaming is 'normal behaviour' like web surfing and working online, then these numbers are probably the safest to use for now.

    1. Technology / Intensity [kgCO2eq/MWh]: solar 0.00410, geothermal 0.00664, wind 0.141, nuclear 10.3, hydro 16.2, biomass 50.9, gas 583, unknown 927, oil 1033, coal 1167

      The proportions of each are going to affect this.

      What would a global average (mean) figure be for renewable energy (i.e. not including nuclear)?
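      One way to answer that question: take a generation-weighted mean over the renewable rows of the table. The intensities below are the table's values, but the generation shares are invented weights purely for illustration:

```python
# Generation-weighted mean carbon intensity for renewables only
# (nuclear excluded, per the question above).
INTENSITY = {"solar": 0.00410, "geothermal": 0.00664, "wind": 0.141,
             "hydro": 16.2, "biomass": 50.9}  # kgCO2eq/MWh, from the table
SHARE = {"solar": 0.10, "geothermal": 0.01, "wind": 0.20,
         "hydro": 0.60, "biomass": 0.09}      # assumed generation mix

weighted_mean = sum(INTENSITY[t] * SHARE[t] for t in INTENSITY) / sum(SHARE.values())
print(f"{weighted_mean:.1f} kgCO2eq/MWh")  # 14.3 kgCO2eq/MWh
```

Because hydro and biomass dominate most real mixes, they also dominate the average; a wind- and solar-heavy mix would come out far lower.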

    2. Fig. 1. The 28 areas considered in this case study, and the power flows between them for the first hour of January 1, 2017. The width of the arrows is proportional to the magnitude of the flow on each line. Power flows to and from neighboring countries, e.g. Switzerland, are included when available, and these areas are shown in gray. The cascade of power flows from German wind and Polish coal are highlighted with blue and brown arrows, respectively.

      I had no idea Germany sold so much power, net to other countries. Always assumed it bought loads of France's power

    1. So energy consumption in the oil and gas sector, into which Google Cloud is selling its cost-reducing services, is four orders of magnitude larger than Google’s data center decarbonization efforts. The harm that Google Cloud will do to the planet, if it reduces underlying costs of this industry by even a small percent, completely dwarfs the data center decarbonization work.

      This is the first time I've seen numbers putting these into perspective. I wonder how these compare for Amazon and M$ ?

    1. Therefore, we estimate an advertising share of 50% for the traffic class web, email, and data in 2016, with an uncertainty range of [25%–75%]. The share is the same for both mobile and fixed traffic.

      Holy biscuits, HALF OF WEB TRAFFIC as ads?

    2. According to a Solarwinds company 2018 study, the average load time for the top 50 websites was 9.46 s with trackers and 2.69 s without.

      Trackers make pages roughly 3.5x slower to load compared to not having them

    3. The Internet's share of the global electricity consumption was 10% in 2014 (Mills, 2013): As a reference, the entire global residential space heating in 2014 consumed the same amount (International Energy Agency, 2017a).

      Heating ALL the homes in all the world uses about the same amount of energy as the internet's electricity consumption, according to this paper

  18. Jan 2020
    1. In particular, on the host, we propose a first configuration of a software-defined power meter that builds on a new CPU power model that accounts for common power-aware features of multi-core processors to deliver accurate power estimations at the granularity of a software process. In the VM, we introduce a second configuration of a software-defined power meter that connects to the host configuration in order to distribute the power consumption of VM instances between the hosted applications. The proposed configuration can even be extended to consider distributed power monitoring scenarios involving application components spread across several host machines.

      So, this would be the app level metering you would want.

    2. While BitWatts is a modular framework that can accommodate different power models (including running average power limit (RAPL) probes and power meters), we propose a process-level power model, which is application-agnostic and accounts for virtualization—i.e., for emulated cores within a VM—and for the power-aware extensions of modern processors, notably hardware threading and dynamic voltage and frequency scaling (DVFS).

      Ah, so this is the key difference and the reason for the Smartwatt formula business
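      A much-simplified sketch of the underlying idea (the real BitWatts model also accounts for DVFS, hardware threads and VM boundaries, none of which appears here):

```python
# Software-defined power meter, toy version: take a host-level power
# reading (e.g. from RAPL) and divide it between processes in
# proportion to their CPU time.
def attribute_power(host_watts: float, cpu_time_by_pid: dict) -> dict:
    total = sum(cpu_time_by_pid.values())
    return {pid: host_watts * t / total for pid, t in cpu_time_by_pid.items()}

reading = attribute_power(80.0, {101: 3.0, 202: 1.0})
print(reading)  # {101: 60.0, 202: 20.0}
```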