Resolved dependencies — PyTorch, which powers Tesla Autopilot and 150K other projects, will join the Linux Foundation. Meta AI announced today that the governance of PyTorch, a popular open source deep learning framework, has moved to an independent organization called the PyTorch Foundation. It will operate as part of the nonprofit Linux Foundation, and its governing board includes representatives from Nvidia, Meta, Google, Microsoft, Amazon, and AMD. The move comes as the PyTorch framework has become widely used across the deep learning industry to power many natural language and computer vision projects, including Tesla Autopilot. Meta cites more than 2,400 contributors and 150,000 projects built on the framework. Spinning PyTorch off into its own foundation avoids potential conflicts of interest that might arise if PyTorch were controlled solely by Meta, which created the framework. "The PyTorch Foundation will strive to adhere to four principles," wrote Meta in a blog post announcing the news. "Remaining open, maintaining neutral branding, staying fair, and forging a strong technical identity. One of the foundation's main priorities is to maintain a clear separation between the business and technical governance of PyTorch." The PyTorch project began in 2016 as an implementation of the Torch library in Python, and it gained renown for tensor computation and its tape-based autograd, which records operations like a tape recorder and then plays them backward to compute gradients. If that sounds complicated, that's because it is—but PyTorch makes those functions easier by pulling them together into building blocks of code that can be used in more complex machine learning projects, without each new project needing to re-invent the fundamentals. Layers upon layers of libraries and frameworks make up most of today's modern software applications, and that's especially true in the machine learning field. According to Meta, the transition to the PyTorch Foundation will not affect any existing PyTorch code, and Meta remains committed to investing in the now-independent framework in the future.
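For readers unfamiliar with what that looks like in practice, here is a minimal sketch (not taken from the announcement) of PyTorch's tensor computation and tape-based autograd in action:

```python
# Tape-based autograd: PyTorch records each operation as it runs, then
# replays the tape backward to compute gradients automatically.
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()   # forward pass; every op is recorded on the "tape"
y.backward()         # play the tape backward to get dy/dx
print(x.grad)        # tensor([2., 4., 6.]), i.e. 2x
```

Building blocks like this are what let new projects skip re-implementing gradient computation from scratch.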
Software Applications
4 Reasons Low-Code Tools Will Never Replace Software Developers While low-code tools have become a go-to solution for companies and businesses, they'll never replace developers. Here's why. Opinions expressed by Entrepreneur contributors are their own. Low-code tools are evolving as companies build applications to meet their needs. Their flexibility and scalability have made them a go-to solution for companies and businesses, which can now create custom applications with ease and meet customers' demands. It's tempting to conclude that low-code tools will replace developers. They won't, especially developers working with languages like C++, Python and Java. Though low-code tools can replace some handwritten code, companies and businesses still need developers to optimize their software and applications. And while low-code is for all developers, it's especially handy for high-code developers, since it speeds up building applications. Ideally, low-code is a powerful software development tool designed to make a developer's life easier. Since most companies are in the early stages of digital transformation, there is plenty for developers to work on, and they are not getting replaced anytime soon. In this article, we'll discuss what exactly low-code tools are and why they'll likely never replace developers. What are low-code tools? Low-code tools are software applications that help technology teams and businesses elevate coding from textual to visual, operating through model-driven, drag-and-drop interfaces. They build value-driven enterprise applications, making them suitable for all development skill levels. Although businesses are rapidly digitalizing their operations, companies in the early stages of machine learning and artificial intelligence adoption stand to benefit most from low-code tools. Plus, there are no industry-specific low-code tools for many sectors — meaning it remains difficult to design new programs without hiring a developer. Why low-code tools will never replace developers 1. High level of flexibility: With a team of developers, you can easily add in-depth functionality to a solution and maintain it without worrying about outages. Sharing responsibilities and allowing professionals to connect and share their ideas is the best way for a business to grow. Besides, it becomes easier to implement requested functionality with a dedicated team of developers. Low-code platforms cannot provide this flexibility, especially when creating complex software solutions. 2. Collaboration: The emergence of low-code tools doesn't mean everything built before them gets thrown away; they emerged in response to increased market demand. Generally, low-code tools came to make old coding methods fast, efficient and exciting for both developers and businesses. These tools push developers toward collaboration: they are prompted to improve their communication skills, interact directly with clients, sharpen their abilities and channel them toward business needs. Low-code brings together businesses, engineers and developers, invites all developers into teamwork and closes the gap between departments. 3. In-demand low-code skills: Businesses always have issues to solve, which means developers with low-code skills will remain in demand. Companies always have improvements they can make. They will not only need developers who can use low-code tools, but may also need handwritten code in areas where low-code cannot solve complex issues.
According to IDC, the global population of low-code developers is expected to grow at an annual rate of 40.4% from 2021 to 2025, which is 3.2 times the growth rate of the general developer population. 4. Avoiding repetitive tasks: On average, developers spend a lot of time dealing with technical debt, but low-code platforms handle loads of that work, reducing the debt that gets introduced in the first place. For instance, developers ordinarily must refactor code every time an operating system update lands; low-code platforms can handle such tasks. It also means developers can focus on inventing new code rather than repeating the same code multiple times. The odds favor developers, because they did not come to the industry to fix and maintain the old but to build new things. They will have more time to focus on complex software solutions and applications, eventually improving their companies. With business competition increasing, there is an urgent need to develop new software solutions fast. But developing complex software is time-consuming: writing code takes hundreds of hours, and customizing it and improving its efficiency takes even more. Since low-code platforms need minimal handwritten code, developing an app on one can take just a few days. Developers can then spend less time creating new code and more time building responsive software that meets customer needs, which means businesses will have sufficient time to predict customers' needs and develop new software based on that data. According to Gartner research, no-code applications will improve innovation and drive adoption of the composable enterprise, an approach that allows real-time adaptability and resilience in unsettled times. No-code and low-code tools also help developers recompose packaged, modular components, improving business capabilities and creating adaptive custom applications. With the ongoing advancement in the tech sector, there is still a software developer shortage. Low-code software and applications can support developers by helping them create applications and features fast. So, low-code tools will never replace developers. Developers should embrace low-code tools and watch their career prospects thrive: explore the tools, build apps, learn how to use them and become more productive.
Software Applications
OpenAI plans major updates to lure developers with lower costs - The company plans to unveil vision capabilities that will enable developers to build applications to analyse images and describe them - The new features mark OpenAI’s ambition to expand beyond a consumer sensation into one also offering a hit developer platform OpenAI plans to introduce major updates for developers next month to make it cheaper and faster to build software applications based on its artificial intelligence (AI) models, as the ChatGPT maker tries to court more companies to use its technology, sources briefed on the plans told Reuters. The updates include the addition of memory storage to its developer tools for using AI models. This could theoretically slash costs for application makers by as much as 20 times, addressing a major concern for partners whose cost of using OpenAI’s powerful models could pile up quickly as they try to build sustainable businesses by developing and selling AI software. The company also plans to unveil new tools such as vision capabilities that will enable developers to build applications with the ability to analyse images and describe them, with potential use cases in fields from entertainment to medicine. The new features mark the company’s ambition to expand beyond a consumer sensation into one also offering a hit developer platform, as its Chief Executive Sam Altman has envisioned. The company toiled in relative obscurity outside the tech industry as a non-profit co-founded by Elon Musk and Altman in 2015. Musk does not currently own a stake in the company. The new features are expected to be rolled out at OpenAI’s first-ever developer conference in San Francisco on November 6, sources said. They are designed to encourage companies to use OpenAI’s technology to build AI-powered chatbots and autonomous agents that can perform tasks without human intervention, said the sources, who asked not to be named because they were discussing the company’s private plans. OpenAI declined to comment. The company burst onto the scene last November when it launched ChatGPT, enticing hundreds of millions of people to try out the chatbot that responded to questions and commands in humanlike ways, turning it into one of the world’s fastest-growing consumer applications. More recently, the company has faced some challenges courting outsiders to build businesses using its technology. Making OpenAI indispensable to other companies building apps is among the most important strategic objectives for Altman. He has met with developers, expressing his desire to build a new ecosystem based on OpenAI’s models, which are now baked into myriad applications, from DoorDash to writing assistant Jasper. The planned release of the so-called stateful API (Application Program Interface) will make it cheaper for companies to create applications by remembering the conversation history of inquiries. This could dramatically reduce the amount of usage developers need to pay for. Currently, processing a one-page document using GPT-4 could cost 10 US cents, depending on the length and complexity of the input and output, according to pricing on OpenAI’s website. Another update, the vision API, would allow people to build software that can analyse images, weeks after the feature became available for ChatGPT users. Giving developers this tool also marks an important step in OpenAI’s rollout of so-called multi-modal capabilities, which process and generate different types of media besides text, such as images, audio and video.
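To make the cost issue concrete, here is a hedged sketch of how developers work with the existing, stateless chat completions API, where the full conversation history must be resent (and billed) on every request; the model name and prompts are illustrative:

```python
# With a stateless API, every follow-up request must carry all prior
# messages, so token costs grow with conversation length. A "stateful"
# API that remembers history server-side would avoid this resending.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
history = [{"role": "user", "content": "Summarize this one-page document: ..."}]
reply = client.chat.completions.create(model="gpt-4", messages=history)
history.append({"role": "assistant", "content": reply.choices[0].message.content})

history.append({"role": "user", "content": "Now shorten that summary."})
# This second call resends everything above, paying for those tokens again.
reply = client.chat.completions.create(model="gpt-4", messages=history)
```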
These releases are designed to attract more developers to pay to access OpenAI’s models to build their own AI software for a variety of uses, such as writing assistants or customer service bots. Investors have poured over US$20 billion this year into AI start-ups, many of which rely on OpenAI or another foundation model company’s technology, according to PitchBook data. But investors are worried about these start-ups’ reliance on companies like OpenAI or Google because this could make them vulnerable to being replicated by rivals or by the bigger companies themselves through product updates. Meanwhile, start-ups are also trying to diversify the types of models they use, experimenting with OpenAI competitors and open-source options such as Meta’s Llama. That makes it important for OpenAI to distinguish itself from deep-pocketed rivals like Google. Keeping developers happy has been a major focus for OpenAI, these sources told Reuters. While ChatGPT has been wildly successful among consumers, OpenAI’s ambition to win over other companies has been less smooth. Earlier this year, the company rushed to release ChatGPT plug-ins, add-on tools that allow developers to create applications within ChatGPT. OpenAI hoped that plug-ins would be its equivalent of Apple’s iOS App Store, gaining an advantage over rival chatbots like Google’s Bard. Developers whose plug-ins are in the top 30 or so “popular” category described an initial burst of hype, followed by a steep drop-off in interest. The popular Scholar AI plug-in had about 7,000 users a day as of late August, estimated its developer Lakshya Bakshi. ChatGPT attracts about 180 million monthly active users. Altman has publicly acknowledged there is more work to do. Earlier this year, he admitted to a group of developers in London that plug-ins have not gained market traction.
Software Applications
Unleashing the Cloud’s Potential: Building Bridges with our Remote Cloud Native Engineers Team Cloud technology has transformed the way we work and collaborate. It has opened up a world of possibilities, enabling seamless connectivity and remote collaboration like never before. With our remote in-house cloud-native engineering team, businesses tap into a vast pool of talent from across the globe, building bridges remotely and harnessing the power of the cloud for their projects. Let's explore how our engineers are transforming the way we work and the positive impact they have on remote in-house engineering. Diverse Perspectives: Remote in-house engineers bring a diverse range of experiences and perspectives to the table. Remote teams span different time zones, cultures, and backgrounds, which leads to innovative solutions and unique approaches to problem-solving. It promotes creativity and fosters an environment of learning and growth, enriching the overall team dynamic. Engineers from various countries and cultures bring a wealth of different experiences, traditions, and ways of thinking, producing a wide range of ideas for problem-solving, since individuals may have perspectives shaped by their cultural backgrounds. Working across time zones is also an advantage: it enables round-the-clock support and development, since someone is always working. This flexibility results in faster response times, improved efficiency, and continuous progress. These teams encompass a wide range of skills and technologies, with diverse technical backgrounds and expertise acquired through different educational paths, work experiences, or self-learning, which brings innovative solutions and fresh insights to problem-solving and fosters continuous learning within the team. Remote work setups typically require effective communication and collaboration tools and practices. When cloud-native engineers from diverse backgrounds collaborate remotely, they learn from each other's unique perspectives and experiences. This cross-pollination of ideas sparks creativity and leads to innovative solutions that may not have been possible in a homogeneous environment. As cloud technologies continue to transcend geographical boundaries, it is essential to have a diverse team that understands different markets and customer needs. Such a team brings firsthand knowledge and insights into various markets, helping tailor solutions to meet diverse customer requirements. Knowledge Sharing: Remote in-house teams of cloud-native engineers leverage the power of the cloud to facilitate seamless knowledge sharing. Cloud-based platforms allow team members to document and share their expertise easily, making it readily available to others. This enables continuous learning and ensures valuable knowledge and insights are not confined to a single individual or location. Cloud platforms provide numerous benefits that promote seamless collaboration and the preservation of valuable knowledge within a team, offering tools and services for creating and maintaining centralized knowledge repositories.
Team members document their expertise, best practices, and solutions in easily accessible formats such as wikis, document management systems, or knowledge bases. These repositories serve as a centralized source of information that can be searched, updated, and expanded upon by team members. Collaboration tools enable engineers to work together on documents, code, and projects, facilitating collaborative problem-solving and knowledge exchange: multiple team members can contribute to a document or codebase, providing instant feedback and sharing insights as they work. Cloud-based video conferencing and virtual meeting tools allow face-to-face discussions, screen sharing, and collaboration. These platforms enable remote teams to conduct meetings, brainstorming sessions, and knowledge-sharing sessions regardless of geographical location, fostering a sense of connection and collaboration. Version control systems and continuous integration tools enable teams to manage code repositories. These systems keep track of changes made by different team members, allowing collaboration on codebases without conflicts. They also enable automated testing and integration, ensuring that knowledge and code changes are constantly validated and integrated into the project. Engineers leverage cloud platforms' monitoring and logging capabilities to gather valuable insights about a system's behavior, performance, and issues; by collecting and analyzing logs, they identify patterns and diagnose problems. This enables knowledge transfer and helps improve the overall performance and reliability of cloud-native applications. Chat and other messaging platforms provide communication channels for quick exchanges of ideas, questions, and knowledge-sharing in an informal setting. Channels dedicated to specific topics or areas of expertise allow team members to share insights, troubleshoot issues, and exchange valuable resources effortlessly. Cloud platforms also provide resources for training and continuous learning: online courses, tutorials, and documentation expand engineers' knowledge and help them acquire new skills, and these platforms offer certifications and communities where engineers engage with experts and peers to share insights and discuss cloud-native practices. Continuous Integration and Delivery: Remote in-house engineers adopt continuous integration and delivery (CI/CD) to streamline the software development process. CI/CD enables rapid and frequent releases and ensures new features and updates reach users on time. With cloud-based infrastructure and automation tools, remote teams can implement efficient CI/CD pipelines, reducing time to market and improving overall software quality. CI/CD has become a fundamental practice in modern software development, automating the building, testing, and deployment of software applications and allowing rapid, frequent releases. CI/CD involves the integration of code changes from multiple developers into a shared repository, which helps identify integration issues early and ensures the codebase remains stable. Automated tests check the functionality and quality of the software; if any tests fail, the CI/CD system provides feedback to the development team, allowing them to address the issues promptly. If the code passes, it is deployed to different environments.
Deployments flow through development, staging, and production environments via the continuous delivery side of CI/CD, and the deployment process can be automated, eliminating manual steps and reducing the risk of errors so that software updates and new features reach users quickly (a minimal sketch of this gating appears at the end of this piece). Teams leverage cloud-based infrastructure and automation tools to implement efficient CI/CD pipelines; cloud platforms make it easier to set up and manage CI/CD systems, and various tools are available, such as Jenkins, Travis CI, CircleCI, and GitLab CI/CD, which enable teams to automate the process. By embracing CI/CD, development teams can reduce time to market, increase productivity, and improve software quality. It promotes collaboration, helps catch bugs early, and allows rapid iteration and experimentation. Overall, CI/CD is a crucial practice for delivering high-quality software in a fast-paced and agile manner. Innovation and Experimentation: Remote teams of cloud-native engineers can experiment and innovate without the constraints of physical boundaries. They can leverage cloud resources to quickly prototype new ideas, test hypotheses, and explore cutting-edge technologies. This culture of innovation nurtures a dynamic and forward-thinking environment, propelling businesses to stay ahead of the competition. Remote teams allow businesses to tap into a global talent pool, and this diversity stimulates creativity and collaboration among team members, leading to fresh ideas and innovative solutions. Cloud resources provide on-demand scalability and flexibility, enabling remote teams to quickly create prototypes and minimum viable products (MVPs) without extensive physical infrastructure: they can spin up virtual environments, deploy applications, and experiment with different configurations easily, reducing time to market for new ideas. Cloud services offer a pay-as-you-go model, eliminating upfront investments in hardware and infrastructure; this cost-effective approach allows teams to experiment with multiple ideas without being constrained by budgetary limitations. Cloud providers continuously introduce new tools, services, and frameworks that help engineers stay at the forefront of technological advancements. Cloud platforms provide automatic scalability, allowing remote teams to handle fluctuations in demand effortlessly: they can run experiments and scale resources up or down based on the results, optimizing performance and cost efficiency. Cloud services offer built-in redundancy and disaster recovery mechanisms, ensuring business continuity during unforeseen events. Cloud platforms also give teams access to vast amounts of data, allowing them to analyze user behavior, system performance, and other relevant metrics, make informed decisions, and gain insights that drive improvement in their products and services. Remote cloud-native engineers understand developing technologies like machine learning, artificial intelligence, the Internet of Things (IoT), and serverless computing. This exposure to emerging trends keeps them at the forefront of technological advancements, allowing them to leverage the latest tools and frameworks to drive innovation. The combination of flexibility, rapid prototyping, cost efficiency, access to cutting-edge technologies, scalability, resilience, and data-driven decision-making sets remote cloud-native engineers up for success.
They are well-positioned to embrace innovation, adapt to changing market demands, and deliver impactful solutions that drive businesses forward.
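As promised above, here is a minimal, hypothetical sketch of the CI/CD gating logic described earlier; the shell commands are placeholders for whatever build, test, and deploy tooling a real project uses:

```python
# Run each pipeline stage in order; a failing stage stops the pipeline
# and feeds back to the team instead of shipping broken code.
import subprocess
import sys

def run_stage(name: str, command: list[str]) -> None:
    print(f"[{name}] running: {' '.join(command)}")
    if subprocess.run(command).returncode != 0:
        sys.exit(f"[{name}] failed; fix the issue before merging.")

run_stage("build", ["python", "-m", "build"])    # package the application
run_stage("test", ["python", "-m", "pytest"])    # automated test suite
for env in ("staging", "production"):            # continuous delivery
    run_stage(f"deploy:{env}", ["echo", f"deploy to {env}"])
```

In practice, hosted tools like Jenkins, CircleCI, or GitLab CI/CD express these same stages declaratively rather than in a script, but the gating idea is identical.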
Software Applications
New Nitrogen malware pushed via Google Ads for ransomware attacks - July 26, 2023 - 11:04 AM A new 'Nitrogen' initial access malware campaign uses Google and Bing search ads to promote fake software sites that infect unsuspecting users with Cobalt Strike and ransomware payloads. The goal of the Nitrogen malware is to provide the threat actors with initial access to corporate networks, allowing them to conduct data theft and cyberespionage and ultimately deploy the BlackCat/ALPHV ransomware. Today, Sophos released a report on the Nitrogen campaign, detailing how it primarily targets technology and non-profit organizations in North America, impersonating popular software like AnyDesk, Cisco AnyConnect VPN, TreeSize Free, and WinSCP. Trend Micro was the first to document this activity at the start of the month, seeing WinSCP ads leading to BlackCat/ALPHV ransomware infections on a victim's network. However, that report focused on the post-infection stage and lacked extensive IoCs (Indicators of Compromise) due to being based on a single incident response. The Nitrogen malware campaign The Nitrogen malware campaign starts with a person performing a Google or Bing search for various popular software applications. The software seen as lures for the Nitrogen malware campaign includes: - AnyDesk (remote desktop application) - WinSCP (SFTP/FTP client for Windows) - Cisco AnyConnect (VPN suite) - TreeSize Free (disk-space calculator and manager) Depending on the targeting criteria, the search engine will display an advertisement that promotes the searched-for software. Clicking the link brings the visitor to compromised WordPress hosting pages that imitate the legitimate software download sites for the particular application. Only visitors from specific geographic regions are redirected to the phishing sites, while direct hits on the malicious URLs trigger a rick-rolling redirection to YouTube videos instead. From those fake sites, users download trojanized ISO installers ("install.exe"), which contain and sideload a malicious DLL file ("msi.dll"). The msi.dll is the installer for the Nitrogen initial access malware, internally named "NitrogenInstaller," which installs the promised app to avoid suspicion along with a malicious Python package. The NitrogenInstaller also creates a registry run key named "Python" for persistence, pointing to a malicious binary ("pythonw.exe") that runs every five minutes. The Python component will execute "NitrogenStager" ("python.311.dll"), which is responsible for establishing communication with the threat actor's C2 and launching a Meterpreter shell and Cobalt Strike Beacons onto the victim's system. In some cases observed by Sophos analysts, the attackers moved to hands-on activity once the Meterpreter script was executed on the target system, executing manual commands to retrieve additional ZIP files and Python 3 environments. The latter is needed for executing Cobalt Strike in memory, as the NitrogenStager cannot run Python scripts. Sophos says that because it detected and stopped the observed Nitrogen attacks, it has not determined the threat actor's end goal, but the infection chain points to staging the compromised systems for ransomware deployment. However, Trend Micro had previously reported that this attack chain led to the deployment of the BlackCat ransomware in at least one case.
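For defenders, one of the IoCs above lends itself to a quick check. Below is a minimal, Windows-only detection sketch for the "Python" registry run key attributed to NitrogenInstaller; it is an illustration, not a complete detection rule:

```python
# Look for the persistence mechanism described above: a run key named
# "Python" under HKCU pointing at a malicious pythonw.exe binary.
import winreg

RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"

try:
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, RUN_KEY) as key:
        value, _ = winreg.QueryValueEx(key, "Python")
        print(f"Suspicious run key found: Python -> {value}")
except FileNotFoundError:
    print("No 'Python' run key present.")
```

A legitimate Python installation does not normally register a run key under this name, so any hit warrants closer inspection of the referenced binary.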
This campaign is not the first time ransomware gangs have abused search engine advertisements to gain initial access to corporate networks; both the Royal and Clop ransomware operations have used this tactic in the past. Users are advised to avoid clicking "promoted" results in search engines when downloading software and to download only from the developer's official site. Also, be wary of software downloads delivered as ISO files, as that is an uncommon way to distribute legitimate Windows software, which usually comes as an .exe or .zip archive.
Software Applications
This just in: smart appliances are still not a bright idea for those who care about privacy. The latest word on the subject comes from Stephan van Rooij, a software architect with Smartersoft BV in the Netherlands and a Microsoft MVP in security. Van Rooij is the owner of two AEG smart appliances – the AEG Built In Combination Microwave (KMK768080B) and the AEG Oven (BSK798280B). As he noted in a write-up this week, these appliances weren't purchased for their connectivity – the fact that they had Wi-Fi was only discovered after they'd been acquired. Internet-connected devices, van Rooij explained, often check to see if Wi-Fi is available, so they can phone home and do whatever it is they need to do. Companies like Apple, Google, and Microsoft have dedicated endpoints to receive network availability checks. Van Rooij argued other manufacturers should follow this example and set up their own endpoints so they're not relying on an external site that may be unexpectedly unavailable. Nonetheless, some suppliers looking to verify wireless network connectivity simply query popular public websites, figuring they'll probably be available. According to van Rooij, that's what Electrolux-owned AEG has done. "AEG chose the easy route, and checks three public websites every five minutes when connected to your Wi-Fi," he said, noting that its smart ovens ping google.com, baidu.cn, and yandex.ru. Google.com is widely recognized. People in the US and Europe may be less familiar with Baidu.cn, a popular search engine in China, and Yandex.ru, a widely used search engine in Russia. (Incidentally, Yandex had its source code allegedly stolen by a former employee and leaked online as a 45GB archive.) "I really don't like the fact that my oven connects to China and Russia just to check if it has an internet connection," said van Rooij. "If that is the only thing it’s doing." This sort of network activity, contacting servers in other countries, is commonplace among smart appliances, not to mention software applications and many of their incorporated SDKs. As noted in a 2019 research paper [PDF] on the topic, "Information Exposure From Consumer IoT Devices," 72 of 81 devices examined were found to send data to third parties. There's nothing necessarily nefarious about network availability pings, but given the abundance of IoT security vulnerabilities and the needless emission of IP address data to search firms in China and Russia, concern may be warranted. Van Rooij noticed the network traffic because he uses Pi-hole software to do DNS-based ad filtering. And others who have implemented similar network filtering report being similarly surprised by the chattiness of their kit. The Register asked the US spokesperson for Sweden-based Electrolux to comment and we've not heard back. When we spoke to van Rooij, he said that he had just heard back from the manufacturer's press department on Thursday morning, which he had messaged after failing to get a response from customer support. "I couldn't get anyone to talk to me," he said. "Now they're talking."
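For illustration, the oven's reported behavior amounts to something like the following sketch, which polls the same three sites on the same five-minute cadence; this is a reconstruction of the described behavior, not AEG's actual firmware code:

```python
# Naive connectivity check: treat any successful fetch of a public site
# as "online". This is the pattern van Rooij observed, and exactly what
# dedicated vendor-run endpoints would avoid.
import time
import urllib.request

SITES = ["https://google.com", "https://baidu.cn", "https://yandex.ru"]

def is_online(timeout: float = 5.0) -> bool:
    for url in SITES:
        try:
            urllib.request.urlopen(url, timeout=timeout)
            return True
        except OSError:  # covers URLError, timeouts, DNS failures
            continue
    return False

while True:
    print("online" if is_online() else "offline")
    time.sleep(300)  # every five minutes, as reported
```

Every iteration of that loop leaks the device's IP address to whichever sites it queries, which is the crux of the privacy complaint.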
Van Rooij said he was particularly concerned about undisclosed connections to China and Russia and argued that a connectivity check could be done through the oven's existing undocumented API, which is used to control it remotely – a separate security risk – via a mobile app. "My suggestion is the oven already has an API in the cloud that should be used to check connectivity," he said. Asked what he'd like to see happen with these sorts of appliances, van Rooij referred to his blog post remarks calling for local control over the existing Wi-Fi network and for making any cloud connection optional. "I think that companies developing appliances that want to 'smartify' should first consider having local control on the current Wi-Fi network, and then make the cloud optional," he said. "You don't buy a device for a year – they last five to 10 years. I'm worried that people may rely on the cloud functionality and these companies don't have the incentive to keep the cloud running for years." ®
Software Applications
A new piece of research has detailed the increasingly sophisticated nature of the malware toolset employed by an advanced persistent threat (APT) group named Earth Aughisky. "Over the last decade, the group has continued to make adjustments in the tools and malware deployments on specific targets located in Taiwan and, more recently, Japan," Trend Micro disclosed in a technical profile last week. Earth Aughisky, also known as Taidoor, is a cyber espionage group known for its ability to abuse legitimate accounts, software, applications, and other weaknesses in network design and infrastructure for its own ends. While the Chinese threat actor has primarily targeted organizations in Taiwan, victimology patterns observed towards late 2017 indicate an expansion to Japan. The most commonly targeted industry verticals include government, telecom, manufacturing, heavy industry, technology, transportation, and healthcare. Attack chains mounted by the group typically leverage spear-phishing as a method of entry, using it to deploy next-stage backdoors. Chief among its tools is a remote access trojan called Taidoor (aka Roudan). The group has also been linked to a variety of malware families, such as GrubbyRAT, K4RAT, LuckDLL, Serkdes, Taikite, and Taleret, as part of its attempts to consistently update its arsenal and evade security software. Some of the other notable backdoors employed by Earth Aughisky over the years are as follows: SiyBot, a basic backdoor that uses public services like Gubb and 30 Boxes for command-and-control (C2); TWTRAT, which abuses Twitter's direct message feature for C2; and DropNetClient (aka Buxzop), which leverages the Dropbox API for C2. Trend Micro's attribution of the malware strains to the threat actor is based on similarities in source code, domains, and naming conventions, with the analysis also uncovering functional overlaps between them. The cybersecurity firm also linked the activities of Earth Aughisky to another APT actor codenamed by Airbus as Pitty Tiger (aka APT24), based on the use of the same dropper in various attacks that transpired between April and August 2014. 2017, the year the group set its sights on Japan and Southeast Asia, also marked an inflection point: the volume of attacks has exhibited a significant decline since then. Despite the threat actor's longevity, the recent shift in targets and activities likely suggests a change in strategic objectives or that the group is actively revamping its malware and infrastructure. "Groups like Earth Aughisky have sufficient resources at their disposal that allow them the flexibility to match their arsenal for long-term implementations of cyber espionage," Trend Micro researcher CH Lei said. "Organizations should consider this observed downtime from this group's attacks as a period for preparation and vigilance for when it becomes active again."
Software Applications
WASHINGTON (AP) — The Justice Department said Wednesday that three Iranian citizens have been charged in the United States with ransomware attacks that targeted power companies, local governments, and small businesses and nonprofits, including a domestic violence shelter. The charges accuse the hacking suspects of targeting hundreds of entities in the U.S. and around the world, encrypting and stealing data from victim networks, and threatening to release it publicly or leave it encrypted unless exorbitant ransom payments were made. In some cases, the victims made those payments, the department said. The Biden administration has tried to go after hackers who have held U.S. targets essentially hostage, often sanctioned or sheltered by adversaries. A Russia-based hacker group was accused of conducting a ransomware attack last year on Georgia-based Colonial Pipeline, which disrupted gas supplies along the East Coast. These hackers are not believed to have been working on behalf of the Iranian government but instead for their own financial gain, and some of the victims were even in Iran, according to a senior Justice Department official who briefed reporters on the case on the condition of anonymity under ground rules set by the department. The official said the activity, even if not directed by the Iranian government, exists because the regime permits hackers to largely operate with impunity. In a related action Wednesday, the Treasury Department’s Office of Foreign Assets Control sanctioned 10 individuals and two entities affiliated with Iran’s Islamic Revolutionary Guard Corps who it says have been involved in malicious cyber activities, including ransomware. The announcements come amid an apparent stalemate in talks between the U.S. and Iran over the possible revival of a 2015 nuclear deal. Israel and some U.S. lawmakers of both parties are pushing the Biden administration to get tougher on Iran, calling the negotiations on Iran’s nuclear program a failure. The three accused hackers are thought to be in Iran and have not been arrested, but the Justice Department official said the pending charges make it “functionally impossible” for them to leave the country. The case was filed in federal court in New Jersey, where a municipality and an accounting firm were among the victims. The alleged hacking took place from October 2020 through last month, when the indictment was issued under seal. The three defendants — identified as Mansour Ahmad, Ahmad Khatibi Aghda and Amir Hossein Nickaein Ravari — are accused of exploiting known or publicly disclosed vulnerabilities in software applications to break into the victims’ computer networks. Prosecutors say the targets were seen by the defendants as victims of opportunity or as entities that would likely be willing to pay money to get their data back. The victims included a domestic violence shelter in Pennsylvania, which the indictment says was extorted out of $13,000 to recover its hacked data; electric utilities in Indiana and Mississippi; and a county government in Wyoming. Associated Press writers Fatima Hussein and Ellen Knickmeyer in Washington and Frank Bajak in Boston contributed to this report.
Software Applications
AppHub, an e-commerce enablement platform serving merchants across e-commerce platforms such as Shopify, WooCommerce, and BigCommerce, today announced a $95 million strategic growth investment from PSG, a leading growth equity firm partnering with software and technology-enabled service companies to help accelerate their growth. PSG joins existing investor Silversmith Capital Partners (“Silversmith”), which will continue its partnership with AppHub. In conjunction with the investment, AppHub acquired Boost, an AI-powered search and discovery tool, to enhance its portfolio of software solutions that enable e-commerce merchants to launch, grow, and scale their businesses. Through AI-driven search, filtering, and product recommendations, Boost helps customers find what they’re looking for, increasing conversion and AOV (average order value). Boost has generated more than $16 billion in sales for more than 14,000 stores to date. Boost is the ninth founder-led e-commerce application to join the AppHub suite since its founding in 2021 in collaboration with Silversmith. Before Boost, AppHub completed its acquisition of REVIEWS.io, a leading online review and UGC management platform. AppHub aims to use the new investment to further build out its leading platform for capturing new customers, increasing conversion, and helping customers create authentic reviews and user-generated content for the more than 100,000 merchants it serves across more than 25 applications, as well as to continue to fuel its M&A strategy. The company has increased the number of merchants on its platform 20x since its initial formation. “We have been on a mission to advance the future of e-commerce through software that enables merchant success. With the support of Silversmith and now PSG, and by bringing Boost and its incredible AI-powered technology into the AppHub fold, we believe we will be even better positioned to help e-commerce businesses accelerate their growth in an increasingly complex environment,” said Kris Eng, co-founder and CEO of AppHub. “We are excited to continue our relationship with Silversmith and welcome PSG as a strategic growth partner.” “The AppHub team has built what we believe is a differentiated and powerful software platform aiming to position merchants for success throughout different phases of their growth journey,” said Matt Stone, Managing Director at PSG. “We look forward to our partnership with AppHub, alongside the Silversmith team, and share in their enthusiasm for the company. We see a significant opportunity for AppHub to scale its portfolio of mission-critical software and deliver long-term growth.” “AppHub’s continued growth is a testament to the team’s dedication to product innovation and merchant success,” said Sri Rao, General Partner at Silversmith. “We are encouraged by the progress to date since helping to launch AppHub in 2021 in partnership with its founders and look forward to partnering with the PSG team as the company enters an exciting next phase of growth.” About AppHub Based in New York City and San Francisco, AppHub is a software platform for e-commerce merchants that brings together best-in-class software applications to drive merchant growth. AppHub’s product suite consists of more than 25 point solutions used by over 100,000 merchants on platforms such as Shopify, WooCommerce, and BigCommerce. AppHub’s mission is to create software that advances the future of commerce. For more information, please visit www.apphub.com.
About Boost Founded in 2017, Boost works with merchants to power and optimize product discovery & navigation. To date, Boost’s app has helped over 14,000 customers build a seamless shopping experience and gain more sales using an AI-fueled search, merchandising, product filter, recommendations, and analytics platform. About PSG Equity PSG Equity (“PSG”) is a growth equity firm that partners with software and technology-enabled services companies to help them navigate transformational growth, capitalize on strategic opportunities and build strong teams. Having backed more than 120 companies and facilitated over 450 add-on acquisitions, PSG brings extensive investment experience, deep expertise in software and technology, and a firm commitment to collaborating with management teams. Founded in 2014, PSG operates out of offices in Boston, Kansas City, London, Paris, Madrid and Tel Aviv. To learn more about PSG, visit www.psgequity.com. About Silversmith Capital Partners Founded in 2015, Silversmith Capital Partners is a Boston-based growth equity firm with $3.3 billion of capital under management. Silversmith’s mission is to partner with and support the best entrepreneurs in growing, profitable technology and healthcare companies. Representative investments include ActiveCampaign, Appfire, Apryse, DistroKid, impact.com, Iodine Software, LifeStance Health, and Webflow. For more information, including a full list of portfolio investments, visit www.silversmith.com or follow the firm on LinkedIn.
Software Applications
In late 2021, a critical vulnerability was discovered within the Apache Log4j logging tool. This Log4j tool and vulnerability became infamous because the tool was used by millions of software packages across organizations that had no idea it existed within their software supply chain. Even organizations that develop their own software often leverage third-party commercial and open-source software to support their business services. Software supply chain risk has emerged as a leading concern for private sector firms and government agencies of all sizes. There is even a legislative effort within the Senate Homeland Security and Governmental Affairs Committee to help secure open-source software. Unpacking this supply chain, and finding methods to estimate and reduce the risk, is a massive problem for a number of reasons. First, the number of open-source packages and libraries is tremendous. GitHub, an online platform that manages software for others, hosts over 200 million software repositories. And each programming language uses its own system for tracking software across its ecosystems. JavaScript and Python, two very popular programming languages, support over a million packages combined. Second, very little is known about the extent to which organizations employ these packages. There is no authoritative directory describing which companies use which software components. In fact, companies themselves may not even know the breadth of software they use for their critical business operations. One research collaboration between Harvard University and the Open Source Software Foundation has begun surveying companies in order to estimate the prevalence of software use across firms, but so far this provides only a tiny account of actual software in use by companies within the United States. Third, the tools for analyzing this risk have yet to be built. Software bills of materials (SBOMs) serve as an ingredient list for software applications. SBOMs are becoming increasingly popular and have even been mandated through a Presidential Executive Order. The intention is that an SBOM will enumerate all of the software components required for a given package to function, thereby helping users identify and manage their software risks. However, the actual practice of creating and disclosing them is still evolving. For example, it is unclear how many layers deep an SBOM should expose a software supply chain. Some packages (like Log4j) may have thousands upon thousands of dependencies, and it is unclear whether this much detail is useful or even necessary. But there may be hope for better understanding this risk. First, the data exist to document and map out this extensive network. They are incomplete, and aren’t easy to find, but they do exist. Libraries.io and deps.dev are two community efforts that offer dependency data across multiple programming languages, from which network maps and network analysis can be created and analyzed. Similarly, the package managers of some software languages provide information that could also be used to map out their software ecosystem. Together, these data could fill a massive gap in our understanding of software dependency. And using standard network analysis techniques, those software components that are most critical to the ecosystems could begin to be identified. Second, as the practice of creating and using SBOMs becomes more mature, users may become more adept at ingesting the information, comparing SBOMs across applications, and identifying the most risky components.
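As a concrete illustration of the network-analysis idea above, here is a hedged sketch that ranks packages by how many others transitively depend on them; the toy dependency edges are hypothetical, and real data could come from sources like Libraries.io or deps.dev:

```python
# Model the ecosystem as a directed graph: an edge A -> B means
# "A depends on B". Packages with many transitive dependents are the
# ecosystem's most critical components (think Log4j).
import networkx as nx

deps = nx.DiGraph()
deps.add_edges_from([
    ("app1", "libA"), ("app2", "libA"), ("libA", "log4j"),
    ("app3", "libB"), ("libB", "log4j"), ("app2", "libB"),
])

# nx.ancestors(deps, pkg) returns everything with a path to pkg,
# i.e. everything that would be exposed if pkg were compromised.
criticality = {pkg: len(nx.ancestors(deps, pkg)) for pkg in deps.nodes}
for pkg, dependents in sorted(criticality.items(), key=lambda kv: -kv[1]):
    print(f"{pkg}: {dependents} transitive dependents")
```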
For example, one approach to using SBOMs to visualize risk might be to sort through all the software packages listed in a given SBOM, and collect the known vulnerabilities from each, information that is readily available from the National Institute of Standards and Technology. Each vulnerability could then be plotted according to its impact, using the Common Vulnerability Scoring System standard, and its exploitability, using the Exploit Prediction Scoring System standard, on a graph that allows risk to be more easily visualized. From there, organizations could visually inspect, compare, and develop strategies for mitigating the risk of one or more software applications. Software supply chain security has emerged as a leading risk because of the massively fragmented and decentralized nature of modern software development. Unlike other problems in cybersecurity, this is a discrete problem, where the data exist. Information required to map software dependents or dependencies is knowable because there exists a finite limit to the number of nodes and dependencies. And so, while we still have much to learn as a community about this risk, there are concrete steps we can take to better understand and mitigate the risk. Sasha Romanosky is a senior policy researcher at the nonprofit, nonpartisan RAND Corporation, an appointed member of the Data Privacy and Integrity Advisory Committee at the Department of Homeland Security, and a former cyber policy advisor at the Pentagon in the Office of the Secretary of Defense for Policy.
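To make the visualization approach described above concrete, here is a hedged sketch of the impact-versus-exploitability plot; the vulnerability scores are illustrative samples rather than a real SBOM scan:

```python
# Plot each vulnerability found in an SBOM's packages by CVSS base score
# (impact, 0-10) and EPSS probability (exploitability, 0-1), so the
# riskiest components cluster in the upper right of the chart.
import matplotlib.pyplot as plt

vulns = [  # (identifier, CVSS base score, EPSS probability) -- sample data
    ("CVE-2021-44228", 10.0, 0.97),  # the Log4j flaw discussed above
    ("CVE-AAAA-0001", 5.3, 0.02),    # hypothetical low-risk finding
    ("CVE-AAAA-0002", 7.5, 0.40),    # hypothetical mid-risk finding
]

for name, cvss, epss in vulns:
    plt.scatter(cvss, epss)
    plt.annotate(name, (cvss, epss))
plt.xlabel("Impact (CVSS base score)")
plt.ylabel("Exploitability (EPSS probability)")
plt.title("Vulnerabilities across an SBOM's components")
plt.show()
```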
Software Applications
We can all agree on one thing: artificial intelligence is currently one of the most critical technologies in the world. But people are still working out how AI will affect the future of labor. Artificial intelligence is already affecting our day-to-day lives, influencing everything from search results to the odds of meeting someone special online to how we choose to make purchases. Artificial intelligence (AI) has been making waves in recent years, and its impact on the future of employment cannot be ignored. AI is changing how we work, the jobs we do, and the skills needed to do them. While some argue that AI will lead to mass unemployment, others believe it will create new jobs and enhance existing ones. On the one hand, AI has the potential to automate many tasks and replace human workers, leading to job displacement and social disruption. On the other hand, AI is also creating new jobs and enhancing existing ones. Here are some of the key ways that AI is impacting the future of employment: Automating repetitive and routine tasks: AI can automate many routine and repetitive tasks, such as data entry, administrative work, and manufacturing. This can lead to increased efficiency, cost savings, and productivity, but it can also lead to job displacement for workers who perform these tasks. Creating new jobs: AI also creates new jobs in areas such as AI development, maintenance, programming, and management. These jobs require specialized skills and knowledge and can be highly rewarding and well-paying. Enhancing existing jobs: AI also enhances existing jobs by providing workers with tools and insights that make their work more efficient and effective. For example, salespeople can use AI to identify patterns in customer behavior and tailor their sales pitch accordingly. Improving accuracy and precision: AI can analyze data and provide more accurate and precise insights than those provided by humans. This can lead to better decision-making and improved outcomes in many industries. Reducing human error: AI can also reduce human error and improve safety in industries such as healthcare and transportation. For example, AI can help doctors to diagnose diseases more accurately and help self-driving cars to avoid accidents. However, there are also some potential drawbacks to using AI in employment. These include job displacement, bias and discrimination in AI systems, lack of transparency in decision-making, and overreliance on AI. Addressing these potential risks and finding ways to mitigate them is essential as we move towards a future where AI is increasingly prevalent in the workplace. Benefits of AI in Employment AI has several benefits in employment. One of the most significant is increased efficiency. AI can automate repetitive and time-consuming tasks, allowing employees to focus on more complex and creative work. For example, AI can scan resumes and cover letters, freeing up HR professionals' time to focus on interviewing and hiring. AI can also help to reduce costs. With AI, companies can automate their processes, reducing the need for manual labor. This can lead to significant cost savings, which can be reinvested in other business areas. Additionally, AI can help to improve product quality and increase customer satisfaction. For example, AI can analyze customer data to identify patterns and improve products or services.
Increased efficiency and productivity: AI can automate routine and repetitive tasks, allowing workers to focus on more complex and creative work. This can increase efficiency and productivity, leading to better business outcomes. Enhanced decision-making: AI can analyze large amounts of data and provide insights that humans may not be able to discern independently. This can help decision-makers make more informed and accurate decisions. Improved safety and quality: AI can help to improve safety and quality in industries such as healthcare and transportation. For example, AI can help doctors to identify potential health risks earlier and help self-driving cars to avoid accidents. Creation of new jobs: While some jobs may be displaced by automation, AI also has the potential to create new jobs in areas such as AI development, programming, and management. Personalization: AI can help create personalized customer experiences by analyzing customers' behavior and preferences. This can lead to higher levels of customer satisfaction and loyalty. Cost savings: AI can help businesses to reduce costs by automating routine tasks and improving efficiency. This can help to improve the bottom line and make businesses more competitive. Overall, the benefits of AI in the future of employment are significant and can help businesses to be more productive, efficient, and innovative. However, it is essential to ensure that AI is deployed in a fair and equitable way and that the potential risks are appropriately managed. Importance of AI in Employment The importance of AI in employment lies in its ability to create new opportunities and enhance existing ones. While AI may replace some jobs, new jobs will be created in their place. For example, while AI may replace some jobs in manufacturing, it will create new jobs in the development, maintenance, and programming of AI systems. Additionally, AI can enhance existing jobs by providing employees with tools and insights to make their work more efficient and effective. For example, a salesperson can use AI to identify patterns in customer behavior and tailor their sales pitch accordingly, resulting in better sales performance. Pros and cons of AI in Employment Pros of AI in Employment - Increased efficiency: AI can automate repetitive tasks, allowing employees to focus on more complex and creative work. - Improved accuracy: AI can analyze data and provide more accurate and precise insights than humans. - Cost savings: AI can help to reduce costs by automating processes and reducing the need for manual labor. - Enhanced productivity: AI can provide employees with tools and insights to make their work more efficient and effective. Cons of AI in Employment - Job displacement: AI may replace some jobs, leading to unemployment and social disruption. - Bias and discrimination: AI systems can be biased, leading to discrimination against certain groups of people. - Lack of transparency: AI systems can be complex and challenging to understand, leading to a lack of transparency in decision-making. - Overreliance on AI: Overreliance on AI can lead to a lack of human judgment and critical thinking. Examples of AI in Employment AI In The Office To Do Administrative Tasks: Today, artificial intelligence applications for the workplace are gaining traction in various digital domains, and AI and robots are changing the workplace by automating administrative duties.
Hence, AI applications will support administrators in digitally managing the company's data and enable staff to focus more on creative, revenue-generating, and productive duties. AI Creates an Interactive Platform With Workers: This is one of the primary advantages of artificial intelligence in workplace administration. Managing human resources is the most crucial role in the workplace. Improved workforce management will aid businesses in increasing staff retention and reaching their objectives more efficiently. Applications and systems powered by AI will assist the company in optimizing employee experiences and increasing staff retention. A collaborative AI-based HRMS (Human Resources Management System) will produce an interactive platform that enables the organization and its employees to connect electronically and enhance their experiences. AI at Work Automates the Hiring Process: This is one of the most significant advantages of artificial intelligence in the workplace. AI software applications automate the candidate selection, screening, and recruitment process using ML and deep learning technology. Apps powered by artificial intelligence for Android and iPhone can assess the qualifications of candidates, validate their expertise, and identify the best candidate for a given post. AI Enhances Workplace Productivity: Every business requires a collection of data sets. Using machine learning algorithms, gathered data can be analyzed and processed precisely and efficiently. Businesses can derive results-driven insights from operational and functional data and develop strategies for enhancing productivity. AI applications generate valuable data-based predictions and increase productivity. Enhanced Customer Service: This is one of the best examples of AI in workplace management. Chatbot applications powered by AI enable businesses to be accessible 24 hours a day, seven days a week, and to provide superior customer service at all times. Intelligent and interactive AI chatbots or virtual assistants can comprehend user commands and provide instantaneous, accurate responses, offering service at a speed and scale that human employees alone cannot match. With these chatbots' assistance, businesses can respond rapidly to customer issues in less time. AI Business Applications Expand Sales: Incorporating AI into the workplace improves sales and marketing functions. The most effective AI software solutions for businesses can perform and manage tedious tasks like identifying potential leads and converting leads into customers. Using AI tools, sales representatives can track and continuously monitor the lead pipeline from any location. CRM (customer relationship management) applications are among the best workplace examples of artificial intelligence: a CRM solution powered by artificial intelligence enables salespeople to digitally retain client contact information, find sales prospects, better manage orders and see delivery status, intelligently execute brand promotional activities, and more. Enhancing Collaboration: This is one of the most effective applications of AI in workplace management. Based on the input provided by software solutions, AI systems and tools will play a crucial part in automatically scheduling meetings and inviting employees. Artificial intelligence apps for workplace management also enable project managers to collaborate with distributed teams and ensure quality output.
Supporting Innovation
This is one of the most noteworthy examples of artificial intelligence in the workplace and corporate management. AI technology in the workplace creates favorable conditions for product and service innovation. As AI solutions take over routine activities, employees have more time to interact with consumers and receive their valuable feedback.

Conclusion
AI is impacting the future of employment in significant ways. While it may lead to job displacement, it will also create new opportunities and enhance existing ones. The benefits of AI in employment include increased efficiency, improved accuracy, cost savings, and enhanced productivity. The drawbacks include job displacement, bias and discrimination, lack of transparency, and overreliance on AI. As AI continues to evolve and become more widespread, it is essential to balance its benefits and drawbacks to ensure a sustainable and equitable future for all. One critical challenge in managing AI's impact on employment is ensuring that workers have the necessary skills to adapt to new roles and opportunities. This requires investment in education and training programs that equip workers with the skills needed to thrive in a world where AI is increasingly prevalent. Another challenge is addressing the potential for bias and discrimination in AI systems. AI is only as unbiased as the data it is trained on, and if that data reflects biases and prejudices, those biases and prejudices can be amplified in the AI system. This requires careful consideration and oversight to ensure that AI systems are designed and deployed fairly and equitably. AI is transforming the world of work in numerous ways, and its impact will only continue to grow in the years to come. While there are both benefits and drawbacks to the use of AI in employment, it is clear that its impact will be significant. By being mindful of the potential risks and taking steps to mitigate them, we can harness the power of AI to create a more productive, efficient, and equitable future for all.
Software Applications
While music producers typically have a favorite digital audio workstation (DAW), most creators are forced to swap between software suites to better suit different workflows and plugin options. This is easier said than done. You have to download the audio stems from one DAW and upload them to the next, and that's just the first step. There's more to worry about, like mix levels, plugin settings, virtual instruments and so much more. Industry veterans PreSonus and Bitwig have teamed up to try to solve these problems. The companies have just announced a new file format intended for moving whole projects between DAWs. The DAWproject file format will only be supported by Bitwig Studio 5.0.9 and PreSonus Studio One 6.5 for now, but it could theoretically be adopted by other companies in the future. Bitwig and PreSonus are calling this a DAW-agnostic platform and stress that they've taken steps to make the technology open to other developers. DAWproject files go much further than simple WAV audio data. The format keeps track of all relevant information across every track and channel in the entire project. This includes time data, audio information, automation, MIDI notes and plugin settings. All you have to do is save the song as a DAWproject file in one DAW and open it in another. That's really it. The time data and automation stuff is really interesting, as this includes fades, time warping, transposition and other chores that are really annoying to re-do over and over again. There are some limitations, as you can only transfer between Bitwig Studio and PreSonus Studio One until other software applications get on board. Also, your plugins must already be installed in both DAWs to instantly transfer settings. Luckily, both DAWs support the VST plugin format, so there shouldn't be an issue there. Apple has a similar feature, as you can open up projects created in GarageBand right in Logic Pro X and maintain settings. The reverse, however, isn't true, due to Logic Pro X being a much more robust application than GarageBand. Beyond that, this is the first universal project standard, well, ever. Here's to hoping more popular DAWs like Pro Tools, Ableton and Logic Pro X adopt this standard, or any standard, in the near future. This isn't Bitwig's first foray into the world of open-source audio. The company recently teamed up with U-he to create CLAP, an open-source plugin format. DAWproject files are available for use starting today, so long as you use Bitwig Studio or PreSonus Studio One.
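To make the idea concrete, here is a minimal Python sketch of how a tool might inspect such a file, assuming the container follows the common zipped-XML pattern. The archive entry name ("project.xml"), the Track element, and its name attribute are assumptions chosen for illustration, not details confirmed in the article above.

import zipfile
import xml.etree.ElementTree as ET

def list_tracks(path: str) -> list[str]:
    """Return the name of every Track element found in a .dawproject file."""
    # Assumption: the .dawproject container is a ZIP archive holding an XML
    # description of the whole project (tracks, clips, automation, plugin state).
    with zipfile.ZipFile(path) as archive:
        with archive.open("project.xml") as project:  # assumed entry name
            root = ET.parse(project).getroot()
    # Collect track names wherever they appear in the project tree.
    return [track.get("name", "?") for track in root.iter("Track")]

if __name__ == "__main__":
    for name in list_tracks("song.dawproject"):
        print(name)

A design like this would let a receiving DAW parse the XML it understands and skip elements it doesn't, which is presumably what makes gradual adoption by other vendors feasible.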
Software Applications
Details have emerged about a now-patched security flaw in the Windows Common Log File System (CLFS) that could be exploited by an attacker to gain elevated permissions on compromised machines. Tracked as CVE-2022-37969 (CVSS score: 7.8), the issue was addressed by Microsoft as part of its Patch Tuesday updates for September 2022; the company also noted that it was being actively exploited in the wild. "An attacker must already have access and the ability to run code on the target system," the company noted in its advisory. "This technique does not allow for remote code execution in cases where the attacker does not already have that ability on the target system." It also credited researchers from CrowdStrike, DBAPPSecurity, Mandiant, and Zscaler for reporting the vulnerability, without delving into additional specifics surrounding the nature of the attacks. Now, the Zscaler ThreatLabz research team has disclosed that it captured an in-the-wild exploit for the then zero-day on September 2, 2022. "The cause of the vulnerability is due to the lack of a strict bounds check on the field cbSymbolZone in the Base Record Header for the base log file (BLF) in CLFS.sys," the cybersecurity firm said in a root cause analysis shared with The Hacker News. "If the field cbSymbolZone is set to an invalid offset, an out-of-bounds write will occur at the invalid offset." CLFS is a general-purpose logging service that can be used by software applications running in either user mode or kernel mode to record data as well as events and to optimize log access. Some of the use cases associated with CLFS include online transaction processing (OLTP), network event logging, compliance audits, and threat analysis. According to Zscaler, the vulnerability is rooted in a metadata block called the base record that's present in a base log file, which is generated when a log file is created using the CreateLogFile() function. "[The base record] contains the symbol tables that store information on the various client, container and security contexts associated with the Base Log File, as well as accounting information on these," according to Alex Ionescu, chief architect at CrowdStrike. As a result, successful exploitation of CVE-2022-37969 via a specially crafted base log file could lead to memory corruption and, by extension, induce a system crash (aka blue screen of death, or BSoD) in a reliable manner. That said, a system crash is just one of the possible outcomes of leveraging the vulnerability; it could also be weaponized to achieve privilege escalation. Zscaler has further made available proof-of-concept (PoC) instructions to trigger the security hole, making it essential that Windows users upgrade to the latest version to mitigate potential threats.
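To illustrate the class of bug, here is a minimal Python sketch of the defensive check the analysis says was missing. The field name cbSymbolZone comes from the Zscaler quote above; the buffer size, the field's byte position, and the surrounding logic are invented for illustration and do not reflect the real CLFS.sys structures.

import struct

BASE_RECORD_SIZE = 0x1000  # hypothetical size of the base record buffer

def parse_symbol_zone(base_record: bytes) -> bytes:
    # Assume cbSymbolZone is a 4-byte little-endian offset stored at byte
    # 0x58 (a position chosen purely for this example).
    (cb_symbol_zone,) = struct.unpack_from("<I", base_record, 0x58)

    # The missing check: an attacker-controlled offset must be validated
    # against the buffer bounds before it is used for any read or write.
    if cb_symbol_zone >= BASE_RECORD_SIZE:
        raise ValueError("cbSymbolZone points outside the base record")

    # Safe to dereference only after the bounds check above.
    return base_record[cb_symbol_zone:cb_symbol_zone + 16]

if __name__ == "__main__":
    # Craft a record whose cbSymbolZone points past the end of the buffer,
    # the condition a crafted BLF file exploits when the check is absent.
    record = bytearray(BASE_RECORD_SIZE)
    struct.pack_into("<I", record, 0x58, 0xFFFF)
    try:
        parse_symbol_zone(bytes(record))
    except ValueError as err:
        print(f"rejected: {err}")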
Software Applications
Published July 10, 2022, 9:06 AM; updated July 11, 2022, 9:13 AM

Video: James Webb Space Telescope launch. NASA's James Webb Space Telescope launched successfully early Christmas morning from French Guiana on South America's northeastern coast, riding a European Ariane rocket into the sky. (Video: NASA)

NASA is preparing to show off what the James Webb Space Telescope is capable of when the space agency releases the first color images from the observatory before it begins scientific operations revealing the mysteries of the universe. After launching on Christmas morning, the telescope's 6.5-meter mirror opened, and its tennis-court-size sunshield unfolded in space. The telescope is now stationed about 1 million miles from Earth and, after commissioning, is ready to begin science observations decades in the making. NASA, the European Space Agency and the Canadian Space Agency plan to release the first full-color images and spectroscopic data from the James Webb Space Telescope on Tuesday, July 12, at 10:30 a.m. ET. The reveal will air live online at NASA.gov and across the agency's social media platforms. Ahead of the full photo release, President Joe Biden will reveal one of JWST's first images on Monday at 5 p.m. ET during a preview at the White House. That event will also air on NASA.gov and NASA TV.

Image: The Large Magellanic Cloud, a small satellite galaxy of the Milky Way, is first shown in images taken by NASA's Spitzer Space Telescope and then in infrared images by the James Webb Space Telescope. (Image credit: NASA/ESA/CSA/STScI; Spitzer: NASA/JPL)

Consider this a friendly warning that these carefully planned cosmic images will be everywhere come Tuesday. Already, Webb's imaging team has shared snippets of Webb's abilities, indicating the coming images will be something to talk about. In April, the space agency and its telescope partners released the first image taken after completing "fine phasing," the alignment of the Optical Telescope Element. Webb's team didn't choose the star, called 2MASS J17554042+655127, for any scientific reason, explained NASA Webb operations scientist Jane Rigby. Still, even though the star was a hundred times fainter than the light a human eye could see, it was blindingly bright to Webb and a testament to the telescope's sensitivity. Then in May, the Webb science team shared an image of the Large Magellanic Cloud, a satellite galaxy of the Milky Way, used to test the telescope's Mid-Infrared Instrument, or MIRI. The image shows the same view taken by NASA's now-retired Spitzer Space Telescope's Infrared Array Camera and then by Webb's MIRI. "Spitzer taught us a lot, but this is like a whole new world, just unbelievably beautiful," Webb's Near-Infrared Camera principal investigator Marcia Rieke said in May.

Image: The James Webb Space Telescope primary mirror illuminated in a dark cleanroom. (Image: NASA Goddard Space Flight Center)

Ahead of the big reveal, NASA released a list of the cosmic targets for Webb's first images. According to the space agency, the objects were chosen by an international committee with representatives from NASA, ESA, CSA and the Space Telescope Science Institute. The first color images from the James Webb Space Telescope include the Carina Nebula, one of the largest and brightest nebulae in the sky, located 7,600 light-years away, and WASP-96 b, a gas exoplanet about 1,150 light-years away from Earth.
The Southern Ring Nebula, an expanding cloud of gas surrounding a dying star, will also be featured in JWST's first data release. Finally, the compact galaxy group Stephan's Quintet, located in the Pegasus constellation, and a galaxy cluster known as SMACS 0723 will test the observatory's deep-field capabilities. JWST mission managers say the telescope has enough fuel to continue operations for several decades because of the precise launch trajectory. Its predecessor, the Hubble Space Telescope, continues to operate after more than 30 years in orbit about 300 miles above Earth. NASA astronauts conducted several spacewalks to repair a flaw in Hubble's primary mirror after the first images came back blurry. The James Webb Space Telescope observatory is about 1 million miles from Earth, meaning a repair mission would be out of the question. Thankfully, Webb's first images came back crystal clear.
Space Exploration
Astronomers have hailed the beginning of a new era of space observation after Nasa unveiled a flurry of full-colour images from the James Webb space telescope, the largest and most powerful space-based observatory ever built. The pictures from the sun-orbiting instrument brought delight – and no end of relief – for researchers who have waited decades for the project to come to fruition and embark on its mission to transform our view of the cosmos.

The galaxy cluster SMACS 0723, known as Webb's First Deep Field, in a composite made from images at different wavelengths taken with a near-infrared camera. Photograph: Nasa/Reuters

After the first image was released at a White House briefing on Monday, the US space agency published further pictures from its Goddard Space Flight Center in Maryland on Tuesday amid cheers and howls of approval. The pictures provide a tantalising glimpse of the observatory's potential to look back to the dawn of time, probe the deep structure of the universe, and allow the study of atmospheres wrapped around planets far beyond the solar system. "I am so thrilled and so relieved," said Dr John Mather, Nasa's senior project scientist on the mission. "This was so hard and it took so long. It's impossible to convey how hard it really was … but we did it." Dr Bill Ochs, Webb's project manager, said the telescope was in "excellent shape" and meeting or exceeding its scientific requirements. The "deep field" image released on Monday showcased Webb's ability to harness the gravitational forces of galaxy clusters to magnify far more distant galaxies behind them. The picture of the SMACS 0723 galaxy cluster, nearly 5bn light years away, brought galaxies into focus as they were more than 13bn years ago.

The distinct signature of water, along with evidence for clouds and haze, in the atmosphere surrounding a hot, puffy gas giant planet orbiting a distant sun-like star. Photograph: Nasa/Getty Images

Analysis of light from one of the galaxies revealed its chemical makeup, a first for such a distant galaxy. "We're seeing these galaxies in detail we've never been able to see before," said Dr Jane Rigby, an operations project scientist on Webb. In the second image, Webb analysed starlight as it passed through the atmosphere of a hot Jupiter-like planet called Wasp-96b, about 1,150 light years away. This revealed the presence of water vapour, though the planet is too hot to harbour liquid water. Astronomers will use the same approach on smaller, rocky planets in the hope of finding worlds where conditions are ripe for life.

A side-by-side comparison of observations of the Southern Ring nebula – in near-infrared light (left) and mid-infrared light (right). Photograph: Nasa/ESA/CSA/STScI/AP

Further images captured the Southern Ring nebula, a vast cloud of gas hurtling away from a dying star about 2,000 light years from Earth. An unexpected streak in the image mystified some on the Nasa team. On closer inspection it was found to be another galaxy, viewed edge on. Perhaps more thrilling was the discovery in an image of Stephan's Quintet, a tight cluster of five galaxies, of an active black hole. While the black hole itself cannot be seen, there is material swirling around it being swallowed by the cosmic monster.

Stephan's Quintet, a visual grouping of five galaxies. Photograph: Nasa/Getty Images

The final image, of a breathtaking stellar nursery called the Carina nebula, is so rich in detail that researchers could discern bubbles, cavities and jets blasting out of newborn stars, along with hundreds more stars they had never seen before. "We see structures that we don't even know what they are," said Dr Amber Straughn, a Nasa astrophysicist. The collection of deep space images marks the official start of science operations for Webb, which encountered major delays and cost overruns before it reached the launchpad. Since it blasted off in December, scientists have endured a nailbiting six months as the observatory has unfolded, deployed a sunshield the size of a tennis court, and aligned its 18 gold-plated mirror segments en route to its destination 1m miles from Earth.

A landscape of mountains and valleys speckled with glittering stars is actually the edge of a nearby, young, star-forming region called NGC 3324 in the Carina nebula. Photograph: Nasa/Getty Images

"The unprecedented detail and resolution of the images will be transformational for astronomy and provide a much deeper understanding of the universe than we currently have," said Martin Barstow, a professor of astrophysics and space science at the University of Leicester. Developed in collaboration with the European and Canadian space agencies, Webb uses a 6.5-metre primary mirror to detect the feeble glimmer of some of the oldest and most distant stars in the cosmos. Because the universe is expanding, light from distant objects is stretched out, "redshifting" it to longer wavelengths. When visible light is stretched into the infrared, it can be detected by Webb's instruments, which are three times sharper and 100 times more sensitive than those on Hubble. "The performance is superb," said Prof Gillian Wright, director of the UK Astronomy Technology Centre in Edinburgh and principal investigator for the mid-infrared (Miri) instrument on Webb. "We'll be able to do all the science we want to do and more. When we've talked about being able to directly image planets around other stars, we know we can do that now. The job now is choosing which stars to look at and which planets to take images of, not whether the telescope is capable or not of doing it. It's more than capable of doing that kind of science, superbly well."
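The redshift relation described above can be written out explicitly; the redshift value and rest wavelength in the worked example below are standard textbook numbers chosen for illustration, not figures from the article:

\[ \lambda_{\mathrm{obs}} = (1 + z)\,\lambda_{\mathrm{emit}} \]

For instance, hydrogen's Lyman-alpha line is emitted in the ultraviolet at \( \lambda_{\mathrm{emit}} = 121.6\ \mathrm{nm} \). From a galaxy at redshift \( z = 10 \), it arrives at

\[ \lambda_{\mathrm{obs}} = (1 + 10) \times 121.6\ \mathrm{nm} \approx 1.34\ \mu\mathrm{m}, \]

squarely in the near-infrared band where Webb's instruments operate, and beyond the reach of a visible-light telescope.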
Space Exploration
Modern astronomy is giving us unprecedented views of the asteroids, comets, and other small bodies that litter our cosmic home. These planetary leftovers offer clues to our creation—and potential destruction.

In 2015 comet C/2014 Q2 Lovejoy—seen here in a two-photo mosaic—neared the sun for the first time in millennia. Lovejoy likely hails from the Oort cloud, a distant shell of icy objects thought to surround the solar system. It's one of the roughly 4,000 known comets among the billions estimated to exist in our cosmic backyard. Photograph by Velimir Popov and Emil Ivanov at the Irida Observatory
Space Exploration
NASA Reveals More Incredible Pics from Space Thanks to Webb Telescope

7/12/2022 8:55 AM PT

NASA's James Webb Telescope continues to stun our world with images of galaxies and stars from billions of light-years away ... just releasing a new set of pics that look like they're straight out of a sci-fi movie. On Tuesday, viewers got a look at the Southern Ring planetary nebula (NGC 3132) and its two stars. It's truly incredible how clear and defined the images of the two stars are ... some of the first of their kind. The new images weren't just a showcase of NGC 3132, though -- we also got a look at five galaxies known as Stephan's Quintet, which NASA describes as "colliding galaxies ... pulling and stretching each other in a gravitational dance." The Carina Nebula was Tuesday's "Grand Finale" for the Webb images ... NASA says, "Webb's new view gives us a rare peek into stars in their earliest, rapid stages of formation. For an individual star, this period only lasts about 50,000 to 100,000 years." Of course, Monday's sneak peek from NASA of what it called "the deepest, sharpest infrared image of the universe ever" promised to deliver ... and it certainly already has. Webb is a collab between NASA, the European Space Agency and the Canadian Space Agency.
Space Exploration
Rocket Lab's Electron rocket waits on the launch pad on the Mahia peninsula in New Zealand, on June 28, 2022. Rocket Lab / AP

July 5, 2022, 1:58 PM UTC / Source: Associated Press

A satellite the size of a microwave oven successfully broke free from its orbit around Earth on Monday and is headed toward the moon, the latest step in NASA's plan to land astronauts on the lunar surface again. It's been an unusual journey already for the Capstone satellite. It was launched six days ago from New Zealand's Mahia Peninsula by the company Rocket Lab in one of their small Electron rockets. It will take another four months for the satellite to reach the moon, as it cruises along using minimal energy. Rocket Lab founder Peter Beck told The Associated Press it was hard to put his excitement into words. "It's probably going to take a while to sink in. It's been a project that has taken us two, two-and-a-half years and is just incredibly, incredibly difficult to execute," he said. "So to see it all come together tonight and see that spacecraft on its way to the moon, it's just absolutely epic." Beck said the relatively low cost of the mission — NASA put it at $32.7 million — marked the beginning of a new era for space exploration. "For some tens of millions of dollars, there is now a rocket and a spacecraft that can take you to the moon, to asteroids, to Venus, to Mars," Beck said. "It's an insane capability that's never existed before."

Rocket Lab Electron's second stage propels Photon and Capstone into its first parking orbit. Business Wire / via AP

If the rest of the mission is successful, the Capstone satellite will send back vital information for months as the first to take a new orbit around the moon called a near-rectilinear halo orbit: a stretched-out egg shape with one end of the orbit passing close to the moon and the other far from it. Eventually, NASA plans to put a space station called Gateway into the orbital path, from which astronauts can descend to the moon's surface as part of its Artemis program. Beck said the advantage of the new orbit is that it minimizes fuel use and allows the satellite — or a space station — to stay in constant contact with Earth. The Electron rocket that launched June 28 from New Zealand was carrying a second spacecraft called Photon, which separated after nine minutes. The satellite was carried for six days in Photon, with the spacecraft's engines firing periodically to raise its orbit farther and farther from Earth. A final engine burst Monday allowed Photon to break from Earth's gravitational pull and send the satellite on its way. The plan now is for the 25-kilogram (55-pound) satellite to far overshoot the moon before falling back into the new lunar orbit Nov. 13. The satellite will use tiny amounts of fuel to make a few planned trajectory course corrections along the way. Beck said they would decide over the coming days what to do with Photon, which had completed its tasks and still had a bit of fuel left in the tank. "There's a number of really cool missions that we can actually do with it," Beck said. For the mission, NASA teamed up with two commercial companies: California-based Rocket Lab and Colorado-based Advanced Space, which owns and operates the Capstone satellite.
Space Exploration
Our view of the universe just expanded: The first image from NASA's new space telescope, unveiled Monday, is brimming with galaxies and offers the deepest look at the cosmos ever captured. The first image from the $10 billion James Webb Space Telescope is the farthest humanity has ever seen in both time and distance, closer to the dawn of time and the edge of the universe. That image will be followed Tuesday by the release of four more galactic beauty shots from the telescope's initial outward gazes. The "deep field" image released at a White House event is filled with lots of stars, with massive galaxies in the foreground and faint and extremely distant galaxies peeking through here and there. Part of the image is light from not too long after the Big Bang, which was 13.8 billion years ago. Seconds before he unveiled it, President Joe Biden marveled at the image he said showed "the oldest documented light in the history of the universe from over 13 billion -- let me say that again -- 13 billion years ago. It's hard to fathom." The busy image with hundreds of specks, streaks, spirals and swirls of white, yellow, orange and red is only "one little speck of the universe," NASA Administrator Bill Nelson said. The pictures on tap for Tuesday include a view of a giant gaseous planet outside our solar system, two images of a nebula where stars are born and die in spectacular beauty and an update of a classic image of five tightly clustered galaxies that dance around each other. The world's biggest and most powerful space telescope rocketed away last December from French Guiana in South America. It reached its lookout point 1 million miles (1.6 million kilometers) from Earth in January. Then the lengthy process began to align the mirrors, get the infrared detectors cold enough to operate and calibrate the science instruments, all protected by a sunshade the size of a tennis court that keeps the telescope cool. The plan is to use the telescope to peer back so far that scientists will get a glimpse of the early days of the universe about 13.7 billion years ago, and to zoom in on closer cosmic objects, even our own solar system, with sharper focus. Webb is considered the successor to the highly successful but aging Hubble Space Telescope. Hubble has stared as far back as 13.4 billion years; it found the light wave signature of an extremely bright galaxy in 2016. Astronomers measure how far back they look in light-years, with one light-year being 5.8 trillion miles (9.3 trillion kilometers). "Webb can see backwards in time to just after the Big Bang by looking for galaxies that are so far away that the light has taken many billions of years to get from those galaxies to our telescopes," Jonathan Gardner, Webb's deputy project scientist, said during the media briefing. How far back did that first image look? Over the next few days, astronomers will do intricate calculations to figure out just how old those galaxies are, project scientist Klaus Pontoppidan said last month. "The image is spectacularly deeper (than a similar one taken by Hubble), but it's unclear how far back we're looking. More info is needed," Richard Ellis, professor of astrophysics at University College London, said by email. The deepest view of the cosmos "is not a record that will stand for very long," Pontoppidan said, since scientists are expected to use the Webb telescope to go even deeper. Thomas Zurbuchen, NASA's science mission chief, said when he saw the images he got emotional, and so did his colleagues: "It's really hard to not look at the universe in new light and not just have a moment that is deeply personal." At 21 feet (6.4 meters), Webb's gold-plated, flower-shaped mirror is the biggest and most sensitive ever sent into space. It comprises 18 segments, one of which was smacked by a bigger-than-anticipated micrometeoroid in May. Four previous micrometeoroid strikes to the mirror were smaller. Despite the impacts, the telescope has continued to exceed mission requirements, with barely any data loss, according to NASA. NASA is collaborating on Webb with the European and Canadian space agencies. "I'm now really excited as this dramatic progress augurs well for reaching the ultimate prize for many astronomers like myself: pinpointing 'Cosmic Dawn' — the moment when the universe was first bathed in starlight," Ellis said.

AP Aerospace Writer Marcia Dunn contributed. The Associated Press Health and Science Department receives support from the Howard Hughes Medical Institute's Department of Science Education. The AP is solely responsible for all content.
Space Exploration
The newly released full-color image highlights a stunning collection of ancient galaxies—and heralds a new age for astronomy.

After a million-mile journey into space, NASA's newest flagship observatory, the James Webb Space Telescope, has captured its first suite of full-color images of the universe. And in a special preview event today, U.S. President Joe Biden has unveiled one of them, in which hundreds—if not thousands—of distant galaxies dapple an inky cosmic sea. "It's a new window into the history of our universe," Biden said during the event. "And today we're going to get a glimpse of the first light to shine through that window." The picture is JWST's first shot at what astronomers call a deep-field image, when the telescope takes a long look at a tiny patch of space, collecting dim light and revealing extremely distant objects. As seen through the instrument's sharp, infrared eye, that little patch is populated by swirling, glowing, gorgeous galaxies, some of which existed more than 13 billion years ago, when the universe was still a toddler. "You do see the deepest view of the universe, ever, in that picture," says NASA's Thomas Zurbuchen, associate administrator of the science mission directorate. At least four additional images will be released on July 12, offering new views of colliding galaxies, the final exhales of a dying star, a massive stellar nursery, and the spectrum of an alien world. Launched on December 25, 2021, JWST is the most powerful telescope to ever take flight. When scientists first envisioned it decades ago, they imagined a telescope that would be able to peer back to the earliest beginnings of the universe, when the first stars and galaxies were emerging from the cosmic murk. To do that, the $10-billion observatory sees the sky in infrared light, or wavelengths that are slightly longer than what human eyes can perceive. Once the images are on the ground, they're colored using a palette that corresponds to the different infrared wavelengths. Over the years, multiple delays, mistakes during assembly, budget overruns, and an ongoing controversy about the man for whom the telescope is named plagued JWST's journey to space. But once there, the telescope successfully performed a complex deployment routine with hundreds of tricky steps. The observatory's 21-foot-diameter mirror unfolded, a multi-layer sunshield unfurled, and the instruments cooled to nearly absolute zero. Now, with these first images in hand, it's clear that JWST is working perhaps even better than expected—and that its next 20 years of science operations will be stuffed with surprises.

A look back in time

The deep-field image released today is, in some ways, analogous to traveling through time. It offers a glimpse of the distant past, when early galaxies were just growing up. Zurbuchen describes it as an "action shot" because the light from those seemingly countless background galaxies is amplified and distorted by the immense gravity of a galactic cluster—called SMACS 0723—in the foreground. That massive cluster, which is four billion light-years away from Earth, acts like a magnifying lens, allowing the light from extremely old, much more distant galaxies to pop into view. "It's clear that the light finds a really complex way to us," Zurbuchen says. "I think it's almost overwhelmingly beautiful, knowing that the photons you're imaging here have been in space, on the way to this camera, for over 13 billion years. I think it just takes your breath away." This isn't the first time scientists have aimed a telescope at a patch of space and waited to see what turned up. In 1995 the Hubble Space Telescope stared at a seemingly empty patch of sky for a hundred hours. That effort produced one of the most revolutionary images in science: a galaxy-studded pocket of space that profoundly altered conceptions of how the universe is populated. Hubble continued to produce deeper and deeper images, stretching the capability of the telescope to see into the early universe. Similarly, JWST will produce ever-deeper snapshots of the cosmos, coaxing secrets from the darkness and unveiling realities that humans have never seen before—and maybe never even imagined. "Now we enter a new phase of scientific discovery. Building on the legacy of Hubble, the James Webb Space Telescope allows us to see deeper into space than ever before, and in stunning clarity," Vice President Kamala Harris, chair of the National Space Council, said during the briefing. "It will enhance what we know about the origins of our universe, our solar system, and possibly life itself."
Space Exploration
By Matt Williams

For decades, scientists have been speculating that life could exist beneath the icy surface of Jupiter's moon Europa. Thanks to more recent missions (like the Cassini spacecraft), other moons and bodies have been added to this list as well – including Titan, Enceladus, Dione, Triton, Ceres and Pluto. In all cases, it is believed that this life would exist in interior oceans, most likely around hydrothermal vents located at the core-mantle boundary. One problem with this theory is that in such undersea environments, life might have a hard time getting some of the key ingredients it would need to thrive. However, in a recent study – which was supported by the NASA Astrobiology Institute (NAI) – a team of researchers ventured that in the outer Solar System, the combination of high-radiation environments, interior oceans and hydrothermal activity could be a recipe for life. The study, titled "The Possible Emergence of Life and Differentiation of a Shallow Biosphere on Irradiated Icy Worlds: The Example of Europa", recently appeared in the scientific journal Astrobiology. The study was led by Dr. Michael Russell with the support of Alison Murray of the Desert Research Institute and Kevin Hand – also a researcher with NASA JPL. For the sake of their study, Dr. Russell and his colleagues considered how the interaction between alkaline hydrothermal springs and seawater is often thought to be the process by which the key building blocks for life emerged here on Earth. However, they emphasize that this process was also dependent on energy provided by our Sun. The same process could have happened on moons like Europa, but in a different way. As they state in their paper: "[T]he significance of the proton and electron flux must also be appreciated, since those processes are at the root of life's role in free energy transfer and transformation. Here, we suggest that life may have emerged on irradiated icy worlds such as Europa, in part as a result of the chemistry available within the ice shell, and that it may be sustained still, immediately beneath that shell." In the case of moons like Europa, hydrothermal springs would be responsible for churning up all the necessary energy and ingredients for organic chemistry to take place. Ionic gradients, involving compounds such as oxyhydroxides and sulfides, could drive the key chemical processes – where carbon dioxide and methane are hydrogenated and oxidized, respectively – which could lead to the creation of early microbial life and nutrients. At the same time, the heat from hydrothermal vents would push these microbes and nutrients upwards towards the icy crust. This crust is regularly bombarded by high-energy electrons created by Jupiter's powerful magnetic field, a process which creates oxidants. As scientists have known for some time from surveying Europa's crust, there is a process of exchange between the moon's interior ocean and its surface. As Dr. Russell and his colleagues indicate, this action would most likely involve the plume activity that has been observed on Europa's surface, and could lead to a network of ecosystems on the underside of Europa's icy crust: "Models for transport of material within Europa's ocean indicate that hydrothermal plumes could be well constrained within the ocean (primarily by the Coriolis force and thermal gradients), leading to effective delivery through the ocean to the ice-water interface. Organisms fortuitously transported from hydrothermal systems to the ice-water interface along with unspent fuels could potentially access a larger abundance of oxidants directly from the ice. Importantly, oxidants might only be available where the ice surface has been driven to the base of the ice shell." As Dr. Russell indicated in an interview with Astrobiology Magazine, microbes on Europa could reach densities similar to what has been observed around hydrothermal vents here on Earth, which may bolster the theory that life on Earth also emerged around such vents. "All the ingredients and free energy required for life are all focused in one place," he said. "If we were to find life on Europa, then that would strongly support the submarine alkaline vent theory." This study is also significant when it comes to mounting future missions to Europa. If microbial ecosystems exist on the underside of Europa's icy crust, then they could be explored by robots that are able to penetrate the surface, ideally by traveling down a plume tunnel. Alternately, a lander could simply position itself near an active plume and search for signs of oxidants and microbes coming up from the interior. Similar missions could also be mounted to Enceladus, where the presence of hydrothermal vents has already been confirmed thanks to the extensive plume activity observed around its southern polar region. Here too, a robotic tunneler could enter surface fissures and explore the interior to see if ecosystems exist on the underside of the moon's icy crust. Or a lander could position itself near the plumes and examine what is being ejected. Such missions would be simpler and less likely to cause contamination than robotic submarines designed to explore Europa's deep ocean environment. But regardless of what form a future mission to Europa, Enceladus, or other such bodies takes, it is encouraging to know that any life that may exist there could be accessible. And if these missions can sniff it out, we will finally know that life in the Solar System evolved in places other than Earth!

Source: Universe Today – Further Reading: Astrobiology Magazine, Astrobiology
Space Exploration
NASA is scheduled to release some of the very first images taken by the James Webb Space Telescope on July 12, 2022. They'll mark the beginning of the next era in astronomy as Webb – the largest space telescope ever built – offers scientific data that will help answer questions about the earliest moments of the universe and allow astronomers to study exoplanets in greater detail than ever before. NASA is expected to begin its coverage of the new images at 9:45 a.m. EDT. But it has taken nearly eight months of travel, setup, testing and calibration to make sure this most valuable of telescopes is ready for prime time. Marcia Rieke, an astronomer at the University of Arizona and the scientist in charge of one of Webb's four cameras, explains what she and her colleagues have been doing to get this telescope up and running.

What's happened since the telescope launched?

After the successful launch of the James Webb Space Telescope on Dec. 25, 2021, the team began the long process of moving the telescope into its final orbital position, unfolding the telescope and – as everything cooled – calibrating the cameras and sensors onboard. The launch went as smoothly as a rocket launch can go. One of the first things my colleagues at NASA noticed was that the telescope had more remaining fuel onboard than predicted to make future adjustments to its orbit. This will allow Webb to operate for much longer than the mission's initial 10-year goal. The first task during Webb's monthlong journey to its final location in orbit was to unfold the telescope. This went along without any hitches, starting with the white-knuckle deployment of the sun shield that helps cool the telescope, followed by the alignment of the mirrors and the turning on of sensors. Once the sun shield was open, our team began monitoring the temperatures of the four cameras and spectrometers onboard, waiting for them to reach temperatures low enough so that we could start testing each of the 17 different modes in which the instruments can operate.

The NIRCam on Webb was the first instrument to go online and helped align the 18 mirror segments. NASA Goddard Space Center/Wikimedia Commons

What did you test first?

The cameras on Webb cooled just as the engineers predicted, and the first instrument the team turned on was the Near Infrared Camera – or NIRCam. NIRCam is designed to study the faint infrared light produced by the oldest stars or galaxies in the universe. But before it could do that, NIRCam had to help align the 18 individual segments of Webb's mirror. Once NIRCam cooled to minus 280 F, it was cold enough to start detecting light reflecting off of Webb's mirror segments and produce the telescope's first images. The NIRCam team was ecstatic when the first light image arrived. We were in business! These images showed that the mirror segments were all pointing at a relatively small area of the sky, and the alignment was much better than the worst-case scenarios we had planned for. Webb's Fine Guidance Sensor also went into operation at this time. This sensor helps keep the telescope pointing steadily at a target – much like image stabilization in consumer digital cameras.
Using the star HD84800 as a reference point, my colleagues on the NIRCam team helped dial in the alignment of the mirror segments until it was virtually perfect, far better than the minimum required for a successful mission.

What sensors came alive next?

As the mirror alignment wrapped up on March 11, the Near Infrared Spectrograph – NIRSpec – and the Near Infrared Imager and Slitless Spectrograph – NIRISS – finished cooling and joined the party. NIRSpec is designed to measure the strength of different wavelengths of light coming from a target. This information can reveal the composition and temperature of distant stars and galaxies. NIRSpec does this by looking at its target object through a slit that keeps other light out. NIRSpec has multiple slits that allow it to look at 100 objects at once. Team members began by testing the multiple-targets mode, commanding the slits to open and close, and they confirmed that the slits were responding correctly to commands. Future steps will measure exactly where the slits are pointing and check that multiple targets can be observed simultaneously. NIRISS is a slitless spectrograph that will also break light into its different wavelengths, but it is better at observing all the objects in a field, not just ones on slits. It has several modes, including two that are designed specifically for studying exoplanets particularly close to their parent stars. So far, the instrument checks and calibrations have been proceeding smoothly, and the results show that both NIRSpec and NIRISS will deliver even better data than engineers predicted before launch.

The MIRI camera, image on the right, allows astronomers to see through dust clouds with incredible sharpness compared with previous telescopes like the Spitzer Space Telescope, which produced the image on the left. NASA/JPL-Caltech (left), NASA/ESA/CSA/STScI (right)/Flickr, CC BY

What was the last instrument to turn on?

The final instrument to boot up on Webb was the Mid-Infrared Instrument, or MIRI. MIRI is designed to take photos of distant or newly formed galaxies as well as faint, small objects like asteroids. This sensor detects the longest wavelengths of Webb's instruments and must be kept at minus 449 F – just 11 degrees F above absolute zero. If it were any warmer, the detectors would pick up only the heat from the instrument itself, not the interesting objects out in space. MIRI has its own cooling system, which needed extra time to become fully operational before the instrument could be turned on. Radio astronomers have found hints that there are galaxies completely hidden by dust and undetectable by telescopes like Hubble, which capture wavelengths of light similar to those visible to the human eye. The extremely cold temperatures allow MIRI to be incredibly sensitive to light in the mid-infrared range, which can pass through dust more easily. When this sensitivity is combined with Webb's large mirror, it allows MIRI to penetrate these dust clouds and reveal the stars and structures in such galaxies for the first time.

What's next for Webb?

As of June 15, 2022, all of Webb's instruments are on and have taken their first images. Additionally, four imaging modes, three time series modes and three spectroscopic modes have been tested and certified, leaving just three to go. On July 12, NASA plans to release a suite of teaser observations that illustrate Webb's capabilities.
These will show the beauty of Webb imagery and also give astronomers a real taste of the quality of data they will receive. After July 12, the James Webb Space Telescope will start working full time on its science mission. The detailed schedule for the coming year hasn’t yet been released, but astronomers across the world are eagerly waiting to get the first data back from the most powerful space telescope ever built. This article is republished from The Conversation under a Creative Commons license. Read the original article.
Space Exploration
Two fascinating pictures of the gas giant Jupiter were leaked alongside the main images from the James Webb Space Telescope (JWST) released yesterday. The images are not formal, full-resolution pictures released in the style of the main releases yesterday, but were included in a NASA commissioning document to show that its NIRCam (Near Infrared Camera) can track moving targets. The composite shows a short-wavelength image of Jupiter on the left, and a long-wavelength image on the right, revealing the kinds of dramatically different atmospheric conditions that the JWST is able to spot. Both images were taken with a 75-second exposure and they show the gas giant's moons Europa, Thebe, and Metis. NASA notes that the shadow of Europa is also visible to the left of the Great Red Spot.

Image: The 'cosmic cliffs' of the Carina Nebula, as seen by the JWST

NASA yesterday released a full set of images from its James Webb Space Telescope, showing what is said to be the "deepest" and most detailed picture of the cosmos to date. A deep field image of a cluster of distant galaxies as they looked billions of years ago - from the very earliest days of the universe - was first released by NASA alongside the president of the United States. The second image was an analysis of the atmosphere of a giant planet called WASP-96b, and is the first "spectrum analysis" of an exoplanet's atmosphere. It was an example of the JWST using a technique called transmission spectroscopy to observe starlight filtered through planetary atmospheres. Because the molecules in the atmosphere absorb particular wavelengths of light, whatever gets filtered through will reveal the chemical compositions of those atmospheres, and potentially indicate if the planet is capable of harbouring life. The third image, another in infrared caught by the NIRCam, showed the Southern Ring Nebula created by a dying star, measuring nearly half a light-year in diameter. The penultimate image showed a group of five galaxies known as Stephan's Quintet, although only four of them are truly interacting - one is simply in the foreground. Many commentators suggested that NASA left the best until last with its beautiful picture of the cosmic cliffs of the Carina Nebula. Josef Aschbacher, the director general of the European Space Agency - a partner to NASA on the James Webb Space Telescope - said it moved him to tears.

Image: The Southern Ring Nebula, as seen by the JWST's NIRCam

The telescope's mission

A partnership of scientists and engineers was formed between NASA, the European Space Agency and the Canadian Space Agency - and for 20 years they worked to complete the £8.4bn telescope. On Christmas Day, 2021, the Webb was launched, and it reached its destination in solar orbit nearly one million miles from Earth a month later. Once there, the telescope underwent a months-long process to unfurl all of its components, including a sun shield the size of a tennis court, and to align its mirrors and calibrate its instruments. The universe has been expanding for 13.8 billion years, meaning the light from the first stars and galaxies has been "stretched" from shorter visible wavelengths to longer infrared ones. This is what allows Webb to see the universe in unprecedented new detail. These pictures are the first of millions the new telescope will produce over its 20-year lifetime. Each full-colour, high-resolution picture that was unveiled on Tuesday took weeks to render from raw telescope data. Watch parties for the picture release took place all over the world, including in the US, Canada, Israel, the UK and Europe.
Space Exploration
Topline

NASA lost contact Tuesday with a satellite on its way to study the moon's orbit—a critical project as the agency seeks to launch its first crewed moon landing in over five decades—but scientists are working to get back in touch with the nearly $33 million spacecraft.

NASA lost communications with its CAPSTONE spacecraft Tuesday afternoon, one week after it took off from New Zealand. (Photo by NASA/Newsmakers) Getty Images

Key Facts

In a statement released Tuesday, NASA said it is still working to figure out the cause of the communications issue, but the satellite has enough fuel for "several days," buying the agency enough time to correct its course. Advanced Space, a Colorado-based startup that developed and operates the $32.7 million satellite, first lost contact with the spacecraft Tuesday through a radio antenna platform called the Deep Space Network, according to the Associated Press. The microwave-sized satellite—called the Cislunar Autonomous Positioning System Technology Operations and Navigation Experiment (CAPSTONE)—was launched last week from New Zealand to study the moon's elongated orbit, eventually bringing it within 1,000 miles of the moon's pole over a four-month journey, according to NASA.

Key Background

The CAPSTONE project is designed to test the orbit intended for a future orbiting outpost supporting NASA's Artemis missions, which aim to stage a first crewed moon landing by 2025—marking the first humans to visit the moon since the final Apollo mission in 1972. NASA said the mission will not only study the moon's orbit but will also create tens of thousands of jobs, "establish the first long-term presence on the moon" and help the agency understand what it would take to "establish a community" on Mars. The space agency says the Artemis missions will also include the first moon landing by a woman and by a person of color.

Further Reading

NASA: Contact lost with spacecraft on way to test moon orbit (Associated Press)
NASA lost contact with a satellite after it broke free of the Earth's orbit (CNN)
Artemis 1: In 100 Days NASA's Long-Awaited Moon Mission Could Blast-Off. Here's Everything You Need To Know (Forbes)
Space Exploration
The James Webb Space Telescope's first images aren't just breathtaking -- they contain a wealth of scientific insights and clues that researchers are eager to pursue. Here are some of the things scientists now hope to learn.

- Into the deep -

Webb's first image, released Monday, delivered the deepest and sharpest infrared image of the distant universe so far, "Webb's First Deep Field." The white circles and ellipses are from the galaxy cluster in the foreground called SMACS 0723, as it appeared more than 4.6 billion years ago -- roughly when our Sun formed too. The reddish arcs are from light from ancient galaxies that has traveled more than 13 billion years, bending around the foreground cluster, which acts as a gravitational lens. NASA astrophysicist Amber Straughn said she was struck by "the astounding detail that you can see in some of these galaxies. They just pop out! There is so much more detail, it's like seeing in high-def." Plus, added NASA astrophysicist Jane Rigby, the image can teach us more about mysterious dark matter, which is thought to comprise 85 percent of matter in the universe -- and is the main cause of the cosmic magnifying effect. The composite image, which required a 12.5-hour exposure time, is considered a practice run. Given longer exposure time, Webb should break all-time distance records by gazing back to the first few hundred million years after the Big Bang, 13.8 billion years ago.

- The hunt for habitable planets -

Webb captured the signature of water, along with previously undetected evidence of clouds and haze, in the atmosphere surrounding a hot, puffy gas giant planet called WASP-96 b that orbits a distant star like our Sun. The telescope achieved this by comparing the starlight filtered through the planet's atmosphere as it moves across the star with the unfiltered starlight detected when the planet is beside the star -- a technique called transmission spectroscopy that no other instrument can perform at the same level of detail. WASP-96 b is one of more than 5,000 confirmed exoplanets in the Milky Way.
But what really excites astronomers is the prospect of pointing Webb at smaller, rocky worlds, like our own Earth, to search for atmospheres and bodies of liquid water that could support life.

- Death of a star -

Webb's cameras captured a stellar graveyard in the Southern Ring Nebula, revealing the dim, dying star at its center in clear detail for the first time, and showing that it is cloaked in dust. Astronomers will use Webb to delve deeper into specifics about "planetary nebulae" like these, which spew out clouds of gas and dust. These nebulae will eventually also lead to rebirth. The gas and cloud ejection stops after some tens of thousands of years, and once the material is scattered in space, new stars can form.

- A cosmic dance -

Stephan's Quintet, a grouping of five galaxies, is located in the constellation Pegasus. Webb was able to pierce through the clouds of dust and gas at the center of the grouping to glean new insights, such as the velocity and composition of outflows of gas near its supermassive black hole. Four of the galaxies are close together and locked in a "cosmic dance" of repeated close encounters. By studying it, "you learn how the galaxies collide and merge," said cosmologist John Mather, adding that our own Milky Way was probably assembled out of 1,000 smaller galaxies. Understanding the black hole better will also give us greater insights into Sagittarius A*, the black hole at the center of the Milky Way, which is shrouded in dust.

- Stellar nursery -

Perhaps the most beautiful image is that of the "Cosmic Cliffs" of the Carina Nebula, a stellar nursery. Here, for the first time, Webb has revealed previously invisible regions of star formation, which will tell us more about why stars form with certain masses, and what determines the number that form in a certain region. They may look like mountains, but the tallest of the craggy peaks are seven light-years high, and the yellow structures are made from huge hydrocarbon molecules, said Webb project scientist Klaus Pontoppidan. In addition to being the stuff of stars, nebular material could also be where we come from. "This may be the way that the universe is transporting carbon, the carbon that we're made of, to planets that may be habitable for life," he said.

- The great unknown -

Perhaps most exciting of all is journeying into the unknown, said Straughn. Hubble played a key role in discovering that dark energy is causing the universe to expand at an ever-growing rate, "so it's hard to imagine what we might learn with this 100 times more powerful instrument."
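The in-transit versus out-of-transit comparison described under "The hunt for habitable planets" boils down to measuring a tiny dip in starlight. Here is a minimal sketch of the arithmetic, with all radii assumed for illustration rather than taken from actual WASP-96 b measurements.

R_SUN = 6.957e8  # solar radius in meters
R_JUP = 7.149e7  # Jupiter radius in meters

def transit_depth(r_planet_m: float, r_star_m: float) -> float:
    """Fraction of starlight blocked when the planet crosses its star."""
    return (r_planet_m / r_star_m) ** 2

# A hot, puffy gas giant around a Sun-like star (assumed example radii):
depth = transit_depth(1.2 * R_JUP, 1.1 * R_SUN)
print(f"Transit depth: {depth:.4%}")  # roughly a 1.26% dip in starlight

# At wavelengths absorbed by atmospheric molecules such as water vapor, the
# planet blocks slightly more light, so the dip is marginally deeper there;
# plotting depth against wavelength yields the transmission spectrum from
# which the atmosphere's composition is read off.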
Space Exploration
This oddly-shaped galactic spectacle is bursting with brand new stars. The pink fireworks in this image taken with the NASA/ESA Hubble Space Telescope are regions of intense star formation, triggered by a cosmic-scale collision. The huge galaxy in this image, NGC 4490, has a smaller galaxy in its gravitational grip and is feeling the strain.

Compared to the other fundamental forces in the Universe, gravity is fairly weak. Despite this, gravity has an influence over huge distances and is the driving force behind the motions of the most massive objects in the cosmos. The scattered and warped appearance of the galaxy in this image, NGC 4490, is a prime example of the results of gravity's unrelenting tug.

Over millions of years, the mutual gravitational attraction between NGC 4490 and its smaller neighbour, NGC 4485, has dragged the two galaxies closer. Eventually, they collided in a swirling crush of stars, gas, and dust. In this image, this most intense period is already over and the two galaxies have moved through each other, untangled themselves, and are speeding apart again. But gravity's pull is relentless; the galaxies are likely to collide again within a few billion years.

Together NGC 4490 and NGC 4485 form the system Arp 269, which is featured in the Atlas of Peculiar Galaxies. They are located 24 million light-years from Earth in the constellation of Canes Venatici (The Hunting Dogs). The extreme tidal forces of their interaction have determined the shapes and properties of the two galaxies. Once a barred spiral galaxy similar to the Milky Way, NGC 4490 has had its outlying regions stretched out, resulting in its nickname of the Cocoon Galaxy. Virtually no trace of its past spiral structure can be seen from our perspective, although its companion galaxy NGC 4485 — not pictured here — still clings on to its spiral arms.

This cosmic collision has created rippling patches of higher-density gas and dust within both galaxies. The conditions there are ripe for star formation; the brilliant pink pockets of light seen here are dense clouds of ionised hydrogen, glowing as they are irradiated with ultraviolet light from nearby young, hot stars. This spectacular burst of new activity has led to NGC 4490's classification as a starburst galaxy.

Star formation is also evident in the thin thread that connects the two galaxies: a bridge of stars created by the ancient crash, stretching over the 24,000 light-years that currently separate the fated pair. But where there is life, there is also death. Several supernovae have been spotted in NGC 4490 over the past few decades, including SN 1982F and SN 2008ax.

Source: Hubble Space Telescope press release
Space Exploration
NASA's $10 billion telescope peers deeper into space than ever, revealing previously undetectable details in the cosmos.

The first images from the James Webb Space Telescope are just a preview of the impressive capabilities of NASA's $10 billion, next-generation observatory. Billed as the successor to the iconic Hubble Space Telescope, which launched into orbit in 1990, Webb was designed to peer deeper into space than ever before, with powerful instruments that can capture previously undetectable details in the cosmos. Here's how the Webb telescope stacks up to its famous predecessor.

Carina Nebula
The Carina Nebula is an active star-forming region located roughly 7,600 light-years away in the constellation Carina. Hubble's view of the stellar nursery was already stunning, but Webb's infrared cameras are able to pierce through cosmic dust, revealing previously invisible areas where new stars are being born.

Southern Ring Nebula
NASA officials likened the Southern Ring Nebula, an expanding shell of gas surrounding a star in its final throes, to the last "performance" of a dying star. The Webb telescope captured features of the Southern Ring Nebula in exquisite new detail, including rings of gas and dust expelled in all directions by the dimmer of the two stars at its center.

Stephan's Quintet
Both Hubble and Webb snapped images of a distant group of five galaxies known as Stephan's Quintet. This band of galaxies is located nearly 300 million light-years away in the constellation Pegasus. Webb's mosaic reveals some never-before-seen details, including bundles of young stars, active starburst regions and huge shock waves as one of the galaxies smashes through the cluster.

SMACS 0723
Among the Webb telescope's first images was a spectacular view of a galaxy cluster known as SMACS 0723. Thousands of bright galaxies can be seen speckled across a small patch of sky, including extremely distant celestial objects from the early days of the universe. By comparing Hubble's and Webb's infrared images of SMACS 0723, it's possible to see how the Webb telescope will be able to peer deeper into the universe than ever before, bringing some of the faintest objects in the cosmos into sharp, new focus.
Space Exploration
A spinning neutron star periodically swings its radio (green) and gamma-ray (magenta) beams past Earth in this artist's concept of a black widow pulsar. The pulsar heats the facing side of its stellar partner to temperatures twice as hot as the sun's surface and slowly evaporates it. NASA/Goddard Space Flight Center/Handout via REUTERS

WASHINGTON, July 29 (Reuters) - Astronomers have observed the most massive known example of an object called a neutron star, one classified as a "black widow" that got particularly hefty by gobbling up most of the mass of a stellar companion trapped in an unhappy cosmic marriage.

The researchers said the neutron star, wildly spinning at 707 times per second, has a mass about 2.35 times greater than that of our sun, putting it perhaps at the maximum possible for such objects before they would collapse to form a black hole.

A neutron star is the compact collapsed core of a massive star that exploded as a supernova at the end of its life cycle. The one described by the researchers is a highly magnetized type of neutron star called a pulsar that unleashes beams of electromagnetic radiation from its poles. As it spins, these beams appear from the perspective of an observer on Earth to pulse - akin to a lighthouse's rotating light. Only one other neutron star is known to spin more quickly than this one.

"The heavier the neutron star, the denser the material in its core," said Roger Romani, director of Stanford University's Center for Space Science and Astrophysics and a co-author of the research published this week in the Astrophysical Journal Letters. "So as the heaviest neutron star known, this object presents the densest material in the observable universe. If it was any heavier it should collapse to a black hole, and then the stuff inside would be behind the event horizon, forever sealed off from any observation," Romani added.

A black hole's event horizon is the point of no return beyond which anything, including light, gets sucked in irretrievably.

"Since we don't yet know how matter works at these densities, the existence of this neutron star is an important probe of these physical extremes," Romani said.

The neutron star, residing in our Milky Way galaxy in the direction of the constellation Sextans and formally named PSR J0952-0607, is located roughly 20,000 light years from Earth, Romani said. A light year is the distance light travels in a year, 5.9 trillion miles (9.5 trillion km). The researchers studied it using the Keck I telescope in Hawaii.

Stars that are about eight or more times the sun's mass transform hydrogen into heavier elements through thermonuclear fusion in their cores. When they build up about 1.4 times the mass of our sun in iron, that core collapses into a neutron star having a diameter only about the size of a city, with the rest blown off in the supernova explosion. Its matter is so compact that an amount about the size of a sugar cube would outweigh Mount Everest.

This neutron star inhabits what is called a binary system, in an orbit with another star.
The neutron star is of a kind dubbed a "black widow," named in honor of female black widow spiders that eat their male partners after mating. It apparently was born with the usual mass of a neutron star, about 1.4 times that of our sun, but its gravitational pull poached material from its companion star, enabling it to grow to a mass seemingly at the uppermost limit before physics would dictate a collapse into a black hole, the densest of all known objects.

Its companion star has been stripped almost bare, losing perhaps 98% of its mass to the black widow, leaving it at about 20 times the mass of our solar system's largest planet Jupiter - a far cry from its original size.

"It has swallowed nearly a full sun's worth of mass without yet becoming a black hole. So it should be just on the edge of black hole collapse," Romani said.
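The densities being described are easy to estimate from the reported mass. The sketch below is a back-of-envelope calculation; the 12 km radius is an assumption typical of neutron-star models, not a figure from the article, so the numbers are illustrative only.

```python
import math

# Back-of-envelope estimate of a neutron star's mean density.
# The 12 km radius is an assumed, textbook-typical value.
M_SUN = 1.989e30                       # kg
mass = 2.35 * M_SUN                    # PSR J0952-0607's reported mass
radius = 12e3                          # metres (assumption)

volume = (4 / 3) * math.pi * radius ** 3
density = mass / volume                # kg per cubic metre

sugar_cube = density * 1e-6            # mass of 1 cm^3 at the mean density
print(f"mean density: {density:.2e} kg/m^3")          # ~6e17 kg/m^3
print(f"one cubic centimetre: ~{sugar_cube:.1e} kg")  # hundreds of millions of tonnes
```

The core density is higher still than this mean value, which is why a sugar-cube-sized sample makes for such striking comparisons.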
Space Exploration
Jonathan Amos, Science correspondent

[Image: The Edinburgh galaxy is the latest in a succession of "most distant" observations from Webb. Credit: CEERS/UoE/Sophie Jewell/Clara Pollock]

Scottish astronomers have spied what they believe to be the most distant galaxy ever observed, using the new super space telescope, James Webb. The red smudge is 35 billion light-years away. We see it, as it was, just 235 million years after the Big Bang. It's a preliminary, or "candidate", result and will need a follow-up study for confirmation. But for the moment, the University of Edinburgh team is celebrating and marvelling at James Webb's power.

"We're using a telescope that was designed to do precisely this kind of thing, and it's amazing," said Callum Donnan, an astrophysics PhD student in the university's Institute for Astronomy.

James Webb is the $10bn successor to the Hubble Space Telescope and is hunting the first stars and galaxies to form in the 13.8-billion-year-old Universe. These objects are extremely faint, but the new observatory has been tuned specifically to pick up their glow in infrared light.

Edinburgh's high mark will almost certainly be short-lived. Since Webb began science operations at the end of June, astronomers have been finding ever more distant candidates in its imagery. And if the designed performance is achieved, scientists could eventually see objects with Webb that were in existence perhaps as little as 100 million years after the Big Bang. So we should expect a succession of announcements in the weeks and months ahead.

Redshift is the term astronomers use when discussing distances in the cosmos. It's a measure that describes the way light coming from an object has been "stretched" by the expansion of the Universe to redder wavelengths. The higher the redshift number assigned to a galaxy, the more distant it is and the earlier it is being viewed in cosmic history. Recent days have seen a stream of ever larger redshift numbers being reported on the popular arxiv pre-print server.

[Image: Maisie's Galaxy was found in a wide-field survey being conducted by James Webb. Credit: CEERS]

The last 24 hours have been a good example of how fast things can move. The Edinburgh group pulled its target from a wide-field survey of the sky that Webb is currently conducting, called the Cosmic Evolution Early Release Science (CEERS) Survey. The team that actually runs this survey put out its own far-distant candidate on Monday, called CEERS J141946.35+525632.8. Dubbed Maisie's Galaxy after an astronomer's daughter, this target has a redshift of 14.3, which means it's being seen about 280 million years after the Big Bang. Not quite as far as CEERS-93316, but still a remarkable prospect compared with the era before James Webb.

There is a big caveat to all this, however. The early candidates announced from Webb observations have yet to undergo full spectroscopic examination. This process slices up the light coming from a galaxy to reveal its component colours - its spectra.
These will give the clearest view of how the light, which was originally emitted at visible wavelengths, has been stretched into the infrared over the course of cosmic history. Only after this task is completed - and Webb has the instruments to do it - will distance claims move on to a surer footing.

Another benefit of spectroscopy is that it will reveal the composition of objects. Theory says the very first stars were fuelled just by hydrogen, helium and a small amount of lithium - the elements created in the Big Bang. Heavier atoms - astronomers call them all "metals" - had to be forged in those pioneer stars and their descendants.

"We can look at the colours in our galaxy in a broad sense, and it's quite blue, which suggests a young stellar population. But it's not blue enough that this galaxy is made up of metal-free stars," Mr Donnan said.
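The redshift arithmetic the article leans on is compact enough to show directly. This is a minimal sketch using astropy's built-in Planck 2018 cosmology; the article does not say which cosmology the teams assumed, so treat the exact ages as illustrative.

```python
from astropy.cosmology import Planck18

# Wavelength stretch: observed = emitted * (1 + z).
# Lyman-alpha is emitted in the ultraviolet at 121.6 nm.
z = 14.3                                   # Maisie's Galaxy candidate redshift
observed_nm = 121.6 * (1 + z)
print(f"121.6 nm is observed at ~{observed_nm:.0f} nm (infrared)")

# Age of the universe at that redshift -> "seen N Myr after the Big Bang"
age = Planck18.age(z).to("Myr")
print(f"age of universe at z={z}: ~{age.value:.0f} Myr")  # ~290 Myr, close to the quoted 280
```

This is also why Webb's infrared tuning matters: ultraviolet light from these galaxies arrives stretched well past the visible band.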
Space Exploration
In the 2nd century CE, Greek-Egyptian astronomer Claudius Ptolemaeus (aka. Ptolemy) compiled a list of all the then-known 48 constellations. This treatise, known as the Almagest, would be used by medieval European and Islamic scholars for over a thousand years to come, effectively becoming astrological and astronomical canon until the early Modern Age.

One of these is the famous Centaur of classical antiquity, otherwise known as the constellation Centaurus. As one of the 48 constellations included in the Almagest, it is now one of the 88 modern constellations recognized by the IAU. Located in the southern sky, this constellation is bordered by the Antlia, Carina, Circinus, Crux, Hydra, Libra, Lupus, Musca, and Vela constellations.

Name and Meaning:

In classic Greco-Roman mythology, Centaurus is often associated with Chiron the Centaur – the wise half-man, half-horse who was a teacher to both Hercules and Jason and the son of the Titan king Cronus and the sea nymph Philyra. According to legend, Cronus seduced the nymph, but they were interrupted by Cronus' wife Rhea. To evade being caught in the act, Cronus turned himself into a horse. As a result, Philyra gave birth to a hybrid son.

Chiron died a tragic death in the end, having been accidentally struck by one of Heracles' poisoned arrows. As an immortal god, he suffered terrible pains but could not die. Zeus eventually took pity on the centaur and released him from immortality and suffering, allowing him to die, and placed him among the stars.

It is believed that the constellation of Sagitta is the arrow which Chiron fired towards Aquila the Eagle to release the tortured Prometheus. The nearby constellation of Lupus the Wolf may also signify an offering of Hercules to Chiron – whom he accidentally poisoned. Just as Virgo above represents the maid, the Centaur was placed in the sky as a sign of pity for his plight.

History of Observation:

The first recorded examples of Centaurus date back to ancient Sumeria, where the constellation was depicted as the Bison-man (MUL.GUD.ALIM). This being was depicted in one of two ways – either as a four-legged bison with a human head, or as a creature with a human head and torso attached to the rear legs of a bison or bull. In the Babylonian pantheon, he was closely associated with the Sun god Utu-Shamash.

The Greek depiction of the constellation as a centaur is where its current name comes from. Centaurus is usually depicted as sacrificing an animal, represented by the constellation Lupus, to the gods on the altar represented by the Ara constellation. The centaur's front legs are marked by two of the brightest stars in the sky, Alpha and Beta Centauri (aka. Rigil Kentaurus and Hadar), which also serve as pointers to the Southern Cross.

In the 2nd century AD, Ptolemy catalogued 37 stars in the constellation and included it as one of the 48 constellations listed in the Almagest. In 1922, it was included in the 88 modern constellations recognized by the International Astronomical Union (IAU).

Notable Features:

Centaurus contains 11 main stars, 9 bright stars and 69 stars with Bayer/Flamsteed designations. Its brightest star – Alpha Centauri (Rigil Kentaurus) – is the Solar System's closest neighbor.
Located just 4.365 light years from Earth, this multiple star system consists of a yellow-white main sequence star that belongs to the spectral type G2V (Alpha Centauri A), and a spectral type K1V star (Alpha Centauri B).

Alpha Centauri A, the brightest component in the system, is the fourth brightest individual star (behind Arcturus) in the night sky, while B is the 21st brightest individual star in the sky. Taken together, however, they are brighter than Arcturus, and rank third among the brightest star systems (behind Sirius and Canopus). The two stars are believed to be roughly the same age – ~4.85 billion years old – and are close in mass to our Sun.

Proxima Centauri, a red dwarf (spectral class M5Ve or M5Vie), is often considered to be a third member of this star system. Located about 0.24 light years from the binary pair (and 4.2 light years from Earth), this star was confirmed in 2016 to be home to the closest exoplanet to Earth (Proxima b).

Then there's Beta Centauri, a blue-white giant star (spectral class B1III) located 348.83 light years from Earth that is the tenth brightest star in the sky. The star's traditional names (Hadar and Agena) are derived from the Arabic words for "ground" and "the knee", respectively. This multiple star system consists of Hadar A, a spectroscopic binary of two identical stars, while Hadar B orbits the primary pair with a period of at least 250 days.

Next up is Theta Centauri (aka. Menkent), an orange K-type giant (spectral class K0IIIb) that is located approximately 60.9 light years from Earth. Its traditional name, which comes from its location in the constellation, translates to "shoulder of the Centaur" in Arabic.

And then there's Gamma Centauri (Muhlifain), a binary star system located 130 light years from Earth which is composed of two stars belonging to the spectral type A0. Its name, translated from Arabic, means "two things" or the "swearing of an oath", and appears to be a case of name-transfer from Muliphein, a star located in the Canis Majoris constellation.

The constellation is also home to many Deep Sky Objects. For instance, there is the Centaurus A galaxy, the fifth brightest galaxy in the sky and one of the closest radio galaxies to the Solar System (between 10 and 16 million light years distant). The galaxy has an apparent visual magnitude of 6.84 and is believed to contain a supermassive black hole at its center. Centaurus A's brightness is attributed to the intense burst of star formation going on inside it, which is believed to be the result of a collision with a spiral galaxy. Centaurus A is located at the center of the Centaurus A subgroup of the Centaurus A/M83 Group of galaxies, which includes the Southern Pinwheel Galaxy (aka. Messier 83, M83).

Then there's the famous Omega Centauri globular cluster, one of the brightest globular clusters in the Milky Way. Located approximately 15,800 light years distant, this cluster is bright enough to be visible to the naked eye. Originally listed as a star by Ptolemy in the Almagest, the cluster's true nature was not discovered until John Herschel studied it in the early 19th century.

Next up is NGC 4945, one of the brightest galaxies in the Centaurus A/M83 group, and the second brightest galaxy in the Centaurus A subgroup.
The spiral galaxy is approximately 11.7 million light years distant and has an active Seyfert II nucleus, which could be due to the presence of a supermassive black hole at its center.

The galaxy NGC 4650A is also located in Centaurus, some 130 million light years from Earth. This galaxy is one of only 100 polar-ring galaxies known to exist, which are so-named because their outer ring of stars and gas rotates over the poles of the galaxy. These rings are believed to have formed from the gravitational interaction of two galaxies, or from a collision with a smaller galaxy in the past.

The Blue Planetary nebula (aka. the Southerner) is a bright planetary nebula in Centaurus, approximately 4,900 light years distant. With an apparent visual magnitude of 8.5, it is the brightest planetary nebula in the far southern region of the sky and can be observed in a small telescope.

Finding Centaurus:

Centaurus is one of the largest constellations in the night sky – covering over 1000 square degrees – and the brightest in the southern hemisphere. For observers located at latitudes between +30° and -90°, the entire constellation is visible, and the northern portion of the constellation can be spotted easily from the northern hemisphere during the month of May.

For the unaided southern skies observer, the constellation of Centaurus holds a gem within its grasp – Omega Centauri (NGC 5139). But of course, this object isn't a star – despite being listed in the catalogs as the constellation's Omega star. It's a globular cluster, and the biggest and brightest of its kind known in the Milky Way Galaxy. Though visible to the naked eye, it is best observed through a telescope or with binoculars.

This 18,300 light-year beauty contains literally millions of stars, with a density so great at its center that the stars are less than 0.1 light year apart. It is possible Omega Centauri may be the remains of a galaxy cannibalized by our own. Even to this present day, something continues to pull at NGC 5139's stars… tidal force? Or an unseen black hole?

Now, hop down to Alpha. Known as Rigil Kentaurus, Rigil Kent, or Toliman, Alpha Centauri is the third brightest star in the entire night sky and the closest star system to our own solar system. To the unaided eye it appears a single star, but it's actually a binary star system. Alpha Centauri A and Alpha Centauri B are the individual stars, and a distant, fainter companion is called Proxima Centauri – a red dwarf that is the nearest known star to the Sun.

Oddly enough, Proxima Centauri is also a visual double, which is assumed to be associated with the Alpha Centauri AB pair. The binary Alpha Cen AB is too close to be resolved by the naked eye, as the angular separation varies between 2 and 22 arc seconds, but during most of the orbital period both stars are easily resolved in binoculars or small telescopes.

Then stop for a moment to take a look at Beta Centauri. Beta Centauri is well-known in the Southern Hemisphere as the inner of the two "Pointers" to the Southern Cross. A line made from the other pointer, Alpha Centauri, through Beta Centauri leads to within a few degrees of Gacrux, the star at the top of the cross. Using Gacrux, a navigator can draw a line with Acrux to effectively determine south.

But that's not all! Hadar is a very nice double star, too. The blue-white giant primary is itself a spectroscopic binary, accompanied by a widely spaced companion separated from the primary by 1.3″. Or try Gamma Centauri!
Muhlifain has an optical companion nearby, but check it out in the telescope… it's really two spectral type A0 stars, each of apparent magnitude +2.9!

For binoculars or telescopes, hop on over to Centaurus A. This incredible radio source galaxy is one of the closest to Earth and also the fifth brightest in the sky. When seen through an average telescope, this galaxy looks like a lenticular or elliptical galaxy with a superimposed dust lane, an oddity first noted in 1847 by John Herschel. The galaxy's strange morphology is generally recognized as the result of a merger between two smaller galaxies, and photographs reveal a jet of material streaming from the galactic core. Although we cannot see it, a supermassive black hole at the center of the galaxy may be responsible for emissions in the X-ray and radio wavelengths!

For binoculars and rich field telescopes, head towards the Crux border and center on Lambda Centauri for the open cluster IC 2944. Also known on some observing lists as Caldwell 100, this scattered star cluster contains about 30 stellar members and some faint nebulosity. About 2 degrees southwest of Beta you'll find another pair of open clusters, NGCs 5281 and 5316. Or try your hand just about a degree west of Alpha for the open cluster NGC 5617. These last three are far richer in stars and well worth the photons!

Centaurus has been known to human astronomers since the Bronze Age and has gone through some changes since that time. But even after thousands of years, the Centaur is still hunting in the night sky! And for those who love viewing classic constellations and bright objects, it still provides viewing opportunities that are bound to dazzle the eyes and inspire the mind!
Space Exploration
Astronomers have detected a strange and persistent "heartbeat" radio signal coming from a far-off galaxy. It has been classified as a fast radio burst (FRB), but where such signals are normally intensely strong emissions of radio waves of unknown origin - typically lasting a few milliseconds at most - this one is different. The new signal - which appears to flash in a pattern similar to a beating heart - runs for up to three seconds, about 1,000 times longer than an average FRB.

News of the discovery emerges in the same week that incredible images of a dying star and a 'cosmic dance' were revealed in an extraordinary set of NASA photos.

The team detected bursts of radio waves that repeat every 0.2 seconds within this window, in a clear periodic pattern. Researchers say there are very few things in the universe known to emit such strictly periodic signals.

Daniele Michilli, a postdoc at the Massachusetts Institute of Technology (MIT) Kavli Institute for Astrophysics and Space Research, explained: "Examples that we know of in our own galaxy are radio pulsars and magnetars, which rotate and produce a beamed emission similar to a lighthouse. And we think this new signal could be a magnetar or pulsar on steroids."

Radio pulsars and magnetars are types of neutron star - extremely dense, rapidly spinning collapsed cores of giant stars.

Called FRB 20191221A, the signal is currently the longest-lasting FRB, with the clearest periodic pattern, detected to date. Its source lies in a distant galaxy, several billion light-years from Earth. The team hopes to detect more periodic signals from this source, which could then be used as an astrophysical clock.
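A 0.2-second repetition inside a three-second window is exactly the kind of pattern an autocorrelation exposes. Below is a minimal sketch of that idea on synthetic data; the burst shape, amplitudes and noise are invented for illustration and have nothing to do with the real detection pipeline.

```python
import numpy as np

# Synthetic "heartbeat" FRB: bursts every 0.2 s inside a 3 s window, plus noise.
rate = 1000                          # samples per second
t = np.arange(0, 3, 1 / rate)
signal = np.random.normal(0, 1, t.size)
for t0 in np.arange(0.1, 3, 0.2):    # a burst every 0.2 s
    signal += 8 * np.exp(-((t - t0) / 0.005) ** 2)

# Autocorrelation of a periodic burst train peaks at multiples of the period.
centered = signal - signal.mean()
ac = np.correlate(centered, centered, mode="full")
ac = ac[ac.size // 2:]               # keep non-negative lags only
lag = np.argmax(ac[50:]) + 50        # skip the trivial zero-lag spike
print(f"estimated period: {lag / rate:.3f} s")   # ~0.200
```

Real searches must also undo dispersion from the intervening plasma before any of this, but the periodicity test itself is this simple at heart.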
Space Exploration
We've now seen farther, deeper and more clearly into space than ever before. A stellar birthplace, a nebula surrounding a dying star, a group of closely interacting galaxies, the first spectrum of an exoplanet's light. These are some of the first images from the James Webb Space Telescope, released in a NASA news briefing on July 12. This quartet of cosmic scenes follows on the heels of the very first image released from the telescope, a vista of thousands of distant galaxies, presented in a White House briefing on July 11.

"First of all, it's really gorgeous. And it's teeming with galaxies," said JWST Operations Scientist Jane Rigby at the July 12 briefing. "That's been true of every image we've taken with Webb. We can't take [an image of] blank sky. Everywhere we look, there's galaxies everywhere."

Going deep

The galaxies captured in the first released image lie behind a cluster of galaxies about 4.6 billion light-years away. The mass from those closer galaxies distorts spacetime in such a way that objects behind the cluster are magnified, giving astronomers a way to peer more than 13 billion years into the early universe. Even with that celestial assist, other existing telescopes could never see so far. But the James Webb Space Telescope, also known as JWST, is incredibly large — at 6.5 meters across, its mirror is nearly three times as wide as that of the Hubble Space Telescope. It also sees in the infrared wavelengths of light where distant galaxies appear. Those features give it an edge over previous observatories.

"There's a sharpness and a clarity we've never had," said Rigby, of NASA's Goddard Space Flight Center in Greenbelt, Md. "You can really zoom in and play around."

[Image: This composite of images, revealing thousands of galaxies, is the deepest view of the universe ever captured — a record astronomers don't expect to last long. Credit: NASA, ESA, CSA, STScI]

Although that first image represents the deepest view of the cosmos to date, "this is not a record that will stand for very long," astronomer Klaus Pontoppidan of the Space Telescope Science Institute in Baltimore said in a June 29 news briefing. "Scientists will very quickly beat that record and go even deeper."

But JWST wasn't built only to peer deeper and farther back in time than ever before. The cache of first images and data showcases space scenes both near and far, glimpses of single stars and entire galaxies, and even a peek into the chemical composition of a far-off planet's atmosphere.

"These are pictures just taken over a period of five days. Every five days, we're getting more data," European Space Agency science advisor Mark McCaughrean said at the July 12 briefing. (JWST is an international collaboration among NASA, ESA and the Canadian Space Agency.) "It's a culmination of decades of work, but it's just the beginning of decades. What we've seen today with these images is essentially that we're ready now."

[Image: This Hubble Space Telescope image of the galaxy cluster SMACS 0723 shows the same spot of sky as the JWST image above. The visible galaxies are fewer and not as far away. Credit: NASA, ESA, HST/STScI/AURA]

Cosmic cliffs

This image shows the "Cosmic Cliffs," part of the enormous Carina nebula, a region about 7,600 light-years from Earth where many massive stars are being born.
Some of the most famous Hubble Space Telescope images feature this nebula in visible light, but JWST shows it in "infrared fireworks," Pontoppidan says. JWST's infrared detectors can see through dust, so the nebula appears especially spangled with stars.

[Image: Newborn stars sculpt the gas and dust around them in this JWST image of the Cosmic Cliffs in the Carina nebula, a star-forming region in the Milky Way galaxy. Credit: NASA, ESA, CSA, STScI]

"We're seeing brand new stars that were previously completely hidden from our view," said NASA Goddard astrophysicist Amber Straughn. But molecules in the dust itself are glowing too. Energetic winds from baby stars in the top of the image are pushing and sculpting the wall of gas and dust that runs across the middle. "We see examples of bubbles and cavities and jets that are being blown out from newborn stars," Straughn said.

And gas and dust are the raw material for new stars — and new planets. "It reminds me that our sun and our planets, and ultimately us, were formed out of this same stuff that we see here," Straughn said. "We humans really are connected to the universe. We're made out of the same stuff."

[Image: This view of young stars sculpting the gas and dust around them in the Carina nebula was captured by the Hubble space telescope in 2010. Credit: NASA, ESA, Mario Livio and Hubble 20th Anniversary Team/STScI]

Foamy nebula

The Southern Ring nebula is an expanding cloud of gas that surrounds a dying star about 2,000 light-years from Earth. In previous Hubble images, the nebula looks like an oblong swimming pool with a fuzzy orange deck and a bright diamond, a white dwarf star, in the middle. JWST expands the view far beyond that, showing more tendrils and structures in the gas than previous telescopes could see.

[Image: JWST captured an image of the Southern Ring nebula in near-infrared (left) and mid-infrared (right) light, highlighting wispy structures at the nebula's edge and revealing a second star in the middle. Credit: NASA, ESA, CSA, STScI]

"You see this bubbly, almost foamy appearance," said JWST astronomer Karl Gordon, of the Space Telescope Science Institute. In the left-hand image, which captures near-infrared light from JWST's NIRCam instrument, the foaminess traces molecular hydrogen that formed as dust expanded away from the center. The center appears blue due to hot ionized gas heated by the leftover core of the star. Rays of light escape the nebula like the sun peeking through patchy clouds.

In the right-hand image, taken by the MIRI mid-infrared camera, the outer rings look blue and trace hydrocarbons forming on the surface of dust grains. The MIRI image also reveals a second star in the nebula's core. "We knew this was a binary star, but we didn't see much of the actual star that produced this nebula," Gordon said. "Now in MIRI this star glows red."

[Image: Hubble took this image of the Southern Ring nebula, a cloud of gas fleeing a dying star, in 2008. Credit: NASA, The Hubble Heritage Team/STScI/AURA]

A galactic quintet

Stephan's Quintet is a group of galaxies about 290 million light-years away that was discovered in 1877. Four of the galaxies are engaged in an intimate gravitational dance, with one member of the group passing through the core of the cluster. (The fifth galaxy is actually much closer to Earth and just appears in a similar spot on the sky.) JWST's images show off more structure within the galaxies than previous observations did, revealing where stars are being born.
[Image: This composite image of Stephan's Quintet shows five galaxies in mid- and near-infrared light. Four of the galaxies are bound by each others' gravity in an endless looping dance. The fifth, the large galaxy to the left, is in the foreground, much closer to Earth than the other four. Credit: NASA, ESA, CSA, STScI]

"This is a very important image and area to study," because it shows the sort of interactions that drive the evolution of galaxies, said JWST scientist Giovanna Giardino of the European Space Agency. In an image from the MIRI instrument alone, the galaxies look like wispy skeletons reaching towards each other. Two galaxies are clearly close to merging. And in the top galaxy, evidence of a supermassive black hole comes to light. Material swirling around the black hole is heated to extremely high temperatures and glows in infrared light as it falls into the black hole.

[Image: This Hubble space telescope image of the five galaxies that make up Stephan's Quintet was released in 2018. Credit: G. Bacon, J. DePasquale, F. Summers and Z. Levay/STScI, NASA, ESA]

An exoplanet's sky

This "image" is clearly different from the others, but it's no less scientifically exciting. It shows the spectrum of light from the star WASP 96 as it passes through the atmosphere of its gas giant planet, WASP 96b.

"You get a bunch of what looks like bumps and wiggles to some people but it's actually full of information content," said NASA exoplanet scientist Knicole Colón. "You're actually seeing bumps and wiggles that indicate the presence of water vapor in the atmosphere of this exoplanet."

[Image: Probing a planet — JWST took a spectrum of light from the star WASP 96 filtering through the atmosphere of its giant planet WASP 96b. The bumps and wiggles show how much light at various wavelengths gets absorbed by the atmosphere, revealing signs of water vapor, haze and unexpected clouds. Credit: NASA, ESA, CSA, STScI]

The planet is about half the mass of Jupiter and orbits its star every 3.4 days. Previously astronomers thought it had no clouds in its sky, but the new data from JWST show signs of clouds and haze. "There is evidence of clouds and hazes because the water features are not quite as large as we predicted," Colón said.

[Image: Gas giant planet WASP 96b, shown in this artist's illustration, orbits its star every 3.4 days. Credit: Engine House]

A long time coming

These first images and data have been a very long time coming. The telescope that would become JWST was first dreamed up in the 1980s, and the planning and construction suffered years of budget issues and delays (SN: 10/6/21). The telescope finally launched on December 25. It then had to unfold and assemble itself in space, travel to a gravitationally stable spot about 1.5 million kilometers from Earth, align its insectlike primary mirror made of 18 hexagonal segments and calibrate its science instruments (SN: 1/24/22). There were hundreds of possible points of failure in that process, but the telescope unfurled successfully and got to work.

"We are so thrilled that it works because there's so much at risk," says JWST senior project scientist John Mather of NASA's Goddard Space Flight Center. "The world has trusted us to put our billions into this and make it go, and it works. So it's an immense relief."

[Image: The James Webb Space Telescope (illustrated) spent months unfolding and calibrating its instruments after it launched on December 25. Credit: Adriana Manrique Gutierrez/CIL/GSFC/NASA]

In the months following, the telescope team released teasers of imagery from calibration, which already showed hundreds of distant, never-before-seen galaxies.
But the images now being released are the first full-color pictures made from the data scientists will use to start unraveling mysteries of the universe. "It sees things that I never dreamed were out there," Mather says.

For the telescope team, the relief in finally seeing the first images was palpable. "It was like, 'Oh my god, we made it!'" says image processor Alyssa Pagan, also of the Space Telescope Science Institute. "It seems impossible. It's like the impossible happened."

In light of the anticipation surrounding the first batch of images, the imaging team was sworn to secrecy. "I couldn't even share it with my wife," says Pontoppidan, leader of the team that produced the first color science images. "You're looking at the deepest image of the universe yet, and you're the only one who's seen that," he says of the first picture released July 11. "It's profoundly lonely."

Soon, though, the team of scientists, image processors and science writers was seeing something new every day for weeks as the telescope downloaded the first images. "It's a crazy experience," Pontoppidan says. "Once in a lifetime."

For Pagan, the timing is perfect. "It's a very unifying thing," she says. "The world is so polarized right now. I think it could use something that's a little bit more universal and connecting. It's a good perspective, to be reminded that we're part of something so much greater and beautiful."

JWST is just getting started as it now begins its first round of full science operations. "There's lots more science to be done," Mather says. "The mysteries of the universe will not come to an end anytime soon."

Asa Stahl contributed to this story.
Space Exploration
[Image: Galaxy cluster SMACS 0723, known as Webb's First Deep Field, features "spiky" stars and even galaxies. Credit: NASA, ESA, CSA, STScI, Webb ERO]

Did you have the same question as I did when you looked at the spectacular first light images from the $10 billion James Webb Space Telescope (JWST or "Webb" for short)? Yes, they are all incredible, and a week later I've spent more hours than I can justify staring at, and zooming in on, Webb's amazing images. However, I couldn't help but notice—and be increasingly distracted by—the oddness of the spiky stars in Webb's images.

Every single star has six very clear, very bright and very uniform spikes—just like a snowflake. Look more carefully and you'll see that there are two more smaller spikes on the horizontal. The brighter the star, the bigger and more distracting the spikes.

[Image: Even galaxies have a spikiness in Webb's images. Credit: NASA, ESA, CSA, and STScI]

Although the phenomenon is only immediately noticeable with stars, look carefully at the galaxies in the hi-res version of the incredible Webb deep field image (main image, above) and you'll see that even galaxies have these spikes. What's going on? It's all down to the physical design of Webb—and its engineers knew the "spiky" stars would be the result.

What kind of telescope is Webb?

Webb is a reflecting telescope of the kind invented by Isaac Newton in the 17th century, only much, much larger. Reflecting telescopes use primary mirrors to reflect light to form an image on a secondary mirror situated in front.

Webb's primary mirror

In Webb's case its primary mirror has a diameter of 21 feet/6.5 meters and, crucially, comprises 18 gold-plated beryllium hexagonal mirror segments. Hexagonal = six-sided. It looks like this:

[Image: The James Webb Space Telescope (JWST or Webb) has a primary mirror made from 18 hexagonal segments. Credit: Getty]

Webb's secondary mirror

See those three struts in front of the primary mirror, above? They support the secondary mirror—where the light from those 18 mirrors that comprise the primary mirror is focused. The two lower struts are angled specifically so they're 150º from the top strut. That's crucial for understanding the spikes. Here's why.

Webb's two sets of 'diffraction spikes'

Diffraction spikes from the primary mirror

Webb's physical construction causes two sets of spikes that purposefully overlap. The hexagonal shape of the 18 mirrors that make up the primary mirror was always going to cause six-sided spiky stars. That's because light travels as a wave, and when it comes up against a boundary, it's redirected and sent off in a different direction. That's diffraction. So the hexagonal shape of the primary mirror segments was always going to mean that all stars in Webb's images would have six diffraction spikes.

Diffraction spikes from the struts

As the light gathered by those 18 hexagonal segments is reflected onto the secondary mirror sitting in front of it, it hits the struts that support it. As it does, it creates a diffraction spike perpendicular to each strut, which together form six more diffraction spikes.

[Image: This illustration demonstrates the science behind Webb's diffraction spike patterns, showing how diffraction spikes happen, the influence of the primary mirror and struts, and the contributions of each to Webb's diffraction spikes. Credit: NASA, ESA, CSA, Leah Hustak (STScI), Joseph DePasquale (STScI)]
How Webb's diffraction spikes overlap

Although the diffraction spikes caused by the struts are smaller and less noticeable than those created by the primary mirror, Webb's engineers cleverly angled the struts so that the resulting diffraction spikes would overlap. So while the struts do create six-sided diffraction spikes, four of them overlap with four previously created by the primary mirror. Only the two horizontal spikes do not overlap. The result is that all stars in the Webb telescope's images have eight points—something that's already a trademark that sets it apart from, say, Hubble's images, which have four diffraction spikes.

[Image: This side-by-side comparison shows observations of the Southern Ring Nebula in near-infrared light, at left, and mid-infrared light, at right, from NASA's Webb Telescope. Credit: NASA, ESA, CSA, STScI]

Why stars in Webb's images won't always look spiky

Look at the images above of the Southern Ring Nebula. The left-hand image was taken with Webb's Near-Infrared Camera (NIRCam) instrument, while the right-hand image used the Mid-Infrared Instrument (MIRI). You can see that only the NIRCam image has really bright "spiky" stars. Why? "In near-infrared light, stars have more prominent diffraction spikes because they are so bright at these wavelengths," said the Space Telescope Science Institute. "In mid-infrared light, diffraction spikes also appear around stars, but they are fainter and smaller."

For now Webb's "spiky stars" in its NIRCam images are slightly distracting, but just like the higher resolution, more detailed images it produces, it's something we'll all quickly get used to seeing. Wishing you clear skies and wide eyes.
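The overlap argument above can be checked with a few lines of arithmetic. This is a toy sketch: the specific position angles (a vertical top strut, and hexagon edges producing spike lines at 30°, 90° and 150°) are assumptions chosen to match the geometry described, not values from NASA documentation.

```python
# Toy model of JWST's spike geometry: spikes live along lines, so work mod 180.
mirror_edges = [90, 30, 150]           # spike lines from the hexagonal segments (deg)
struts = [90, 90 + 150, 90 - 150]      # one vertical strut, two offset by 150 deg

spike_lines = set(mirror_edges)
for s in struts:
    spike_lines.add((s + 90) % 180)    # each strut adds a spike line perpendicular to it

print(sorted(spike_lines))             # [0, 30, 90, 150]: 4 lines -> 8 spike points
```

Four of the six strut spikes land on lines the mirror already produces; only the horizontal line (0°) is new, which is exactly the small extra pair visible in the images.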
Space Exploration
The European Space Agency's (ESA) Euclid space telescope launched from Cape Canaveral on July 1. By all accounts, the launch passed smoothly. Even when mission control lit up Euclid's instruments and took the telescope's first test images, everything seemed well. But as with any space mission — and, for that matter, any complex enough endeavor — complications inevitably arose.

Euclid's Fine Guidance Sensor (FGS) is an apparatus that helps orient the telescope. Under normal circumstances, the FGS locks onto known stars to plot where the telescope is pointing in the sky. Engineers can exhaustively test the FGS before launch, but real space conditions are very difficult to simulate. Ground tests do not necessarily account for factors like cosmic rays that interfere with the device. Mission control extended Euclid's commissioning phase in order to write up a software update that addresses the system's anomaly, and ESA is optimistic that this update has fixed the problem.

"I'm relieved to say that initial tests are looking good. We're finding many more stars in all our tests, and while it's too early to celebrate and more observations are needed, the signs are very encouraging," Micha Schmidt, Euclid's operation manager, said in a statement.

But that wasn't the end of Euclid's issues. One of Euclid's instruments occasionally seemed to be picking up strange streaks of light. Soon, ESA mission control found the culprit: the sun. Euclid is located at the Sun-Earth Lagrange Point 2, sharing this space with NASA's James Webb Space Telescope. Here, "behind" the Earth, the Sun remains at the telescope's back, and the telescope lies in Earth's shadow. To provide Euclid even more protection, the telescope has a sunshield. But the sunshield did not shadow everything. Part of one of Euclid's thrusters reflected a tiny amount of light that seems to have evaded the sunshield. Euclid's Visible light instrument (VIS) is sensitive enough to discern that reflected light when the instrument is turned to certain angles. In all, the stray light pops in around 10 percent of VIS's images. How much that affects Euclid's mission remains an open question.
Space Technology
When I was a child in the 1970s, seeing a satellite pass overhead in the night sky was a rare event. Now it is commonplace: sit outside for a few minutes after dark, and you can't miss them.

Thousands of satellites have been launched into Earth orbit over the past decade or so, with tens of thousands more planned in coming years. Many of these will be in "mega-constellations" such as Starlink, which aim to cover the entire globe. These bright, shiny satellites are putting at risk our connection to the cosmos, which has been important to humans for countless millennia and has already been greatly diminished by the growth of cities and artificial lighting. They are also posing a problem for astronomers – and hence for our understanding of the universe.

In new research accepted for publication in Astronomy and Astrophysics Letters, we discovered Starlink satellites are also "leaking" radio signals that interfere with radio astronomy. Even in a "radio quiet zone" in outback Western Australia, we found the satellite emissions were far brighter than any natural source in the sky.

A problem for our understanding of the universe

Our team at Curtin University used radio telescopes in Western Australia to examine the radio signals coming from satellites. We found expected radio transmissions at designated and licensed radio frequencies, used for communication with Earth. However, we also found signals at unexpected and unintended frequencies. We found these signals coming from many Starlink satellites. It appears the signals may originate from electronics on board the spacecraft.

Why is this an issue? Radio telescopes are incredibly sensitive, to pick up faint signals from countless light-years away. Even an extremely weak radio transmitter hundreds or thousands of kilometres away from the telescope appears as bright as the most powerful cosmic radio sources we see in the sky. So these signals represent a serious source of interference.

And specifically, the signals are an issue at the location where we tested them: the site in WA where construction has already begun for part of the biggest radio observatory ever conceived, the Square Kilometre Array (SKA). This project involves 16 countries, has been in progress for 30 years, and will cost billions of dollars over the next decade. Huge effort and expense has been invested in locating the SKA and other astronomy facilities a long way away from humans. But satellites present a new threat in space, which can't be dodged.

What can we do about this?

It's important to note satellite operators do not appear to be breaking any rules. The regulations around use of the radio spectrum are governed by the International Telecommunications Union, and they are complex. At this point there is no evidence Starlink operators are doing anything wrong.

The radio spectrum is crucial for big business and modern life. Think mobile phones, wifi, GPS and aircraft navigation, and communications between Earth and space. However, the undoubted benefits of space-based communications – such as globally accessible fast internet connections – are coming into conflict with our ability to see and explore the universe. (There is some irony here, as wifi in part owes its origins to radio astronomy.)

Regulations evolve slowly, while the technologies driving satellite constellations like Starlink are developing at lightning speed. So regulations are not likely to protect astronomy in the near term.
But in the course of our research, we have had a very positive engagement with SpaceX engineers who work on the Starlink satellites. It is likely that the goodwill of satellite operators, and their willingness to mitigate the generation of these signals, is the key to solving the issue. In response to earlier criticisms, SpaceX has made improvements to the amount of sunlight Starlink satellites reflect, making them one-twelfth as bright in visible light as they used to be. We estimate emissions in radio wavelengths will need to be reduced by a factor of a thousand or more to avoid significant interference with radio astronomy. We hope these improvements can be made, in order to preserve humanity’s future view of the universe, the fundamental discoveries we will make, and the future society-changing technologies (like wifi) that will emerge from those discoveries.
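To put those two mitigation factors in the units each community tends to quote, here is a small illustrative conversion. The factors of 12 and 1,000 are the ones given above; the magnitude and decibel formulas are standard arithmetic.

```python
import math

# Converting the quoted brightness-reduction factors into familiar units.
optical_factor = 12          # Starlink visible-light dimming achieved so far
radio_factor = 1000          # estimated reduction needed at radio wavelengths

delta_mag = 2.5 * math.log10(optical_factor)   # astronomical magnitudes
delta_db = 10 * math.log10(radio_factor)       # decibels, as radio engineers quote it

print(f"12x dimmer   ~ {delta_mag:.1f} magnitudes fainter")   # ~2.7 mag
print(f"1000x lower  ~ {delta_db:.0f} dB reduction")          # 30 dB
```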
Space Technology
Though our sun frequently blasts out radiation in all directions and at light-like speeds, studying this star requires precise instruments onboard just a handful of spacecraft dedicated to the effort. However, scientists have now identified an innovative way to retrieve solar data — almost every type of spacecraft holds a specific sensor, they say, and this mundane instrument may be hiding a wealth of important solar data. We just have to unlock the scientific treasure troves.

Among the many sensors onboard spacecraft, one responsible for housekeeping data about instruments and payload health also happens to record valuable data when charged particles emanating from the sun strike the spacecraft. It's also responsible for detecting impacts from meteorites. And, according to new research, it's this data that can be used to better understand how space weather affects spacecraft.

"Spacecraft are launched with instruments, payloads, and it's thought 'great — it will do science with that,' but a spacecraft is so much more," Beatriz Sanchez-Cano, a planetary scientist at the University of Leicester in England and the lead author of the new study, said in a statement.

The sensor, known as the error detection and correction (EDAC) memory counter, is essentially a piece of code that protects spacecraft computers from getting corrupted. Mainly, it protects these computers' memories by correcting data every time charged particles from the sun strike the spacecraft and lead to errors that need rectifying. But because the frequency of those error corrections spikes when solar flares hit the spacecraft, scientists believe the resulting data about these impacts could be used to monitor space weather.

"Engineering data like this has always been vital when flying missions through deep space, but it's exciting to know decades worth of this information can also be used to build a scientific picture of the solar system," Simon Wood, the spacecraft operations engineer for Mars Express and an author of the new study, said in the statement. "It's why we never throw anything away — you don't know what secrets are being stored in the data beamed down from space," he added.

In the new study, researchers extracted two decades' worth of data recorded in the EDAC memory counters onboard seven spacecraft launched and operated by the European Space Agency (ESA). All seven craft explore the inner solar system, including the Solar Orbiter, which has been studying the sun's poles since 2020; the star-mapping Gaia spacecraft; BepiColombo, which launched in 2018 and is now on its way to Mercury; and Venus Express, which arrived at our sister planet in 2006 and lasted until 2014.

Although none of these spacecraft are designed to study solar radiation per se, their EDAC memory counters recorded each time a charged particle hit the spacecraft, providing years of archival data that would be difficult and expensive to collect otherwise. This includes a solar storm the sun unleashed in March 2012, one of the largest of its kind to strike Mars and Venus, recorded as such by instruments onboard the Mars Express and Venus Express orbiters. Because of the storm, a device on the latter that helps orient it using a camera went blind for five days, the researchers write in the new study.

The EDAC memory counter onboard the Mars Express spacecraft, despite being the oldest among the seven analyzed in this study, turned out to be so sensitive it felt almost every strike from solar particles.
Researchers say these memory counters are onboard all spacecraft, so similar studies involving a network of spacecraft throughout the solar system could paint a more comprehensive picture of how solar particles behave.

However, they caution that extracting data from memory counters is a somewhat painstaking effort, because how the sensors respond to errors is not the same for all spacecraft, owing to different designs. Different sensors can also carry different amounts of shielding and react in various ways depending on the specific materials installed on them. Also, as the parameters describing the data from such sensors are not universal, "dedicated help from operations and engineering staff of each mission is needed," the authors write in the study.

Nonetheless, the study significantly increases the resources available for understanding what's been recorded by spacecraft thus far. The new paper was published in July in the journal Space Weather.
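To make the EDAC idea concrete, here is a minimal, hypothetical sketch of a memory scrubber that bumps a counter whenever it repairs an upset. Real flight hardware uses SECDED (single-error-correct, double-error-detect) codes such as Hamming codes; this toy version stores each word in triplicate and majority-votes, which is simpler but exhibits the same counter behaviour the study mines for science.

```python
import random

# Toy EDAC scrubber: each word is stored three times; a scrub pass
# majority-votes the copies, repairs any outlier, and counts the repair.
# (Real spacecraft use Hamming/SECDED codes rather than triplication.)
memory = [[w, w, w] for w in range(64)]     # 64 words, 3 copies each
edac_counter = 0

def strike(mem):
    """Simulate a charged particle flipping one bit in one stored copy."""
    word = random.choice(mem)
    word[random.randrange(3)] ^= 1 << random.randrange(8)

def scrub(mem):
    """Repair single-copy upsets by majority vote; count each correction."""
    global edac_counter
    for copies in mem:
        majority = max(set(copies), key=copies.count)
        for i, c in enumerate(copies):
            if c != majority:
                copies[i] = majority
                edac_counter += 1

for _ in range(10):      # ten particle strikes between scrub passes
    strike(memory)
scrub(memory)
print("EDAC counter after scrub:", edac_counter)   # ~10; fewer if strikes collide
```

The science signal in the paper is essentially this counter's increment rate over time: it jumps when a solar storm sweeps past the spacecraft.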
Space Technology
The night sky is peppered with light from stars, planets and other celestial bodies. But now researchers have revealed one of the brightest objects visible from Earth is a communications satellite resembling a Tetris block.

Scientists say while the apparent brightness of BlueWalker 3 is not constant, its peak is on a par with Procyon and Achernar, two of the brightest stars in the night sky. "After BlueWalker 3 unfolded its 64 m² array it is visible in both dark sky and urban skies, though in urban settings, this will be limited to when BlueWalker 3 passes overhead," said Dr Jeremy Tregloan-Reed, a co-author of the study at the Universidad de Atacama in Chile.

But the appearance of the satellite is more than a mere curiosity. "Large constellations of bright artificial satellites in low Earth orbit (LEO) pose significant challenges to ground-based astronomy," the study's authors write. Tregloan-Reed says, among other problems, a large reflective brightness means when a satellite crosses the detector of a telescope, it leaves a streak that can be difficult, if not impossible, to remove so that data in the affected pixels can be recovered. But he added space-based astronomy also faced challenges from such satellites, noting observations by the Hubble telescope had increasingly been affected by Starlink streaks because the telescope is in a LEO, sitting below that of the satellite internet constellation.

While the researchers say efforts are being made by the aerospace industry, policymakers, astronomers and others to mitigate the impact of such satellites, "the trend towards the launch of increasingly larger and brighter satellites continues to grow". BlueWalker 3, built by AST SpaceMobile, is only a predecessor to a planned constellation of satellites nicknamed BlueBirds. And there is another concern: the team note the radio frequencies used by BlueWalker 3 are close to those used for radio astronomy, raising the possibility such satellites could cause interference, making it harder for scientists to study the universe.

Writing in the journal Nature, researchers describe how a team of amateur and professional astronomers from Chile, the US, Mexico, New Zealand, the Netherlands and Morocco joined forces to make observations of the night sky and explore the impact of BlueWalker 3 – the largest commercial communications array in LEO. While BlueWalker 3 was folded when it was launched in September last year, once in space the array opened to reveal a huge surface area that reflected sunlight. However, Tregloan-Reed noted another important factor in how bright an object appears is its distance from Earth, with satellites such as BlueWalker 3 in LEO appearing much brighter than geostationary satellites.

Tregloan-Reed added that even if the reflective brightness of all satellites was reduced to below the level visible by the naked eye, there could be problems. "The sky background glow will increase due to the cumulative effect of having hundreds of thousands of satellites from various operators from many countries in low Earth orbit," he said.

The study is not the first to raise concerns over increasing brightness in the night sky as a result of such artificial objects. This year researchers called for scientists to stand up to "big light", saying the number of low-altitude satellites should be capped to reduce light pollution and preserve the ability to study the night sky.
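Astronomical magnitudes are logarithmic, which makes "on a par with Procyon" easy to quantify. A minimal sketch follows; the magnitude values are rough figures assumed for illustration (Procyon at about 0.4, and 6.0 as the conventional dark-sky naked-eye limit), not numbers from the study.

```python
def flux_ratio(m1: float, m2: float) -> float:
    """How many times brighter an object of magnitude m1 is than one of m2.
    By definition, 5 magnitudes correspond to a factor of 100 in brightness."""
    return 100 ** ((m2 - m1) / 5)

peak_mag = 0.4          # BlueWalker 3 at its brightest, roughly Procyon (assumed)
naked_eye_limit = 6.0   # conventional dark-sky limit (assumed)

print(f"~{flux_ratio(peak_mag, naked_eye_limit):.0f}x brighter "
      "than the faintest naked-eye stars")   # ~170x
```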
Space Technology
A revolutionary satellite that will reveal celestial objects in a new light and the “Moon Sniper” lunar lander lifted off Wednesday night. The launch by the Japanese space agency, which was rescheduled several times due to bad weather, took place aboard an H-IIA rocket from Tanegashima Space Center at 7:42 p.m. ET Wednesday, or 8:42 a.m. Japan Standard Time on Thursday. The event streamed live on JAXA’s YouTube channel, offering a broadcast in both English and Japanese. The XRISM satellite (pronounced “crism”), also called the X-Ray Imaging and Spectroscopy Mission, is a joint mission between JAXA and NASA, along with participation from the European Space Agency and Canadian Space Agency. Along for the ride is JAXA’s SLIM, or Smart Lander for Investigating Moon. This small-scale exploration lander is designed to demonstrate a “pinpoint” landing at a specific location within 100 meters (328 feet), rather than the typical kilometer range, by relying on high-precision landing technology. The precision led to the mission’s nickname, Moon Sniper. The satellite and its two instruments will observe the universe’s hottest regions, largest structures and objects with the strongest gravity, according to NASA. XRISM will detect X-ray light, a wavelength invisible to humans. Studying stellar explosions and black holes: X-rays are released by some of the most energetic objects and events in the universe, which is why astronomers want to study them. “Some of the things we hope to study with XRISM include the aftermath of stellar explosions and near-light-speed particle jets launched by supermassive black holes in the centers of galaxies,” said Richard Kelley, XRISM principal investigator at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, in a statement. “But of course, we’re most excited about all the unexpected phenomena XRISM will discover as it observes our cosmos.” Compared with other wavelengths of light, X-rays are so short that they pass through the dish-shaped mirrors that telescopes such as James Webb and Hubble use to observe and collect visible, infrared and ultraviolet light. With that in mind, XRISM has thousands of curved individual nested mirrors better designed to detect X-rays. The satellite will need to calibrate for a few months once it reaches orbit. The mission is designed to operate for three years. The satellite can detect X-rays that have energies ranging from 400 to 12,000 electron volts, which is far beyond the energy of visible light at 2 to 3 electron volts, according to NASA. This range of detection will allow for studying cosmic extremes across the universe. The satellite carries two instruments called Resolve and Xtend. Resolve tracks tiny temperature shifts that help it determine the source, composition, motion and physical state of X-rays. Resolve operates at minus 459.58 degrees Fahrenheit (minus 273.10 degrees Celsius), a temperature about 50 times colder than that of deep space, thanks to a refrigerator-size container of liquid helium. This instrument will help astronomers unlock cosmic mysteries such as the chemical details of glowing hot gas inside galactic clusters. “XRISM’s Resolve instrument will let us peer into the make-up of cosmic X-ray sources to a degree that hasn’t been possible before,” Kelley said. 
“We anticipate many new insights about the hottest objects in the universe, which include exploding stars, black holes and galaxies powered by them, and clusters of galaxies.” Meanwhile, Xtend will provide XRISM with one of the largest fields of view on an X-ray satellite. “The spectra XRISM collects will be the most detailed we’ve ever seen for some of the phenomena we’ll observe,” said Brian Williams, NASA’s XRISM project scientist at Goddard, in a statement. “The mission will provide us with insights into some of the most difficult places to study, like the internal structures of neutron stars and near-light-speed particle jets powered by black holes in active galaxies.” Moon Sniper sets its sights on a crater: Meanwhile, SLIM will use its own propulsion system to head toward the moon. The spacecraft will arrive in lunar orbit about three to four months after launch, orbit the moon for one month, and begin its descent and attempt a soft landing four to six months after launch. If the lander is successful, the technology demonstration will also briefly study the lunar surface. Unlike other recent lander missions aiming for the lunar south pole, SLIM is targeting a site near a small lunar impact crater called Shioli, in the vicinity of the Sea of Nectar, where it will investigate the composition of rocks that may help scientists uncover the origins of the moon. The landing site is just south of the Sea of Tranquility, where Apollo 11 landed near the moon’s equator in 1969. Following the United States, the former Soviet Union and China, India became the fourth country to execute a controlled landing on the moon when its Chandrayaan-3 mission arrived August 23 near the lunar south pole. Previously, Japanese company Ispace’s Hakuto-R lunar lander fell 3 miles (4.8 kilometers) before crashing into the moon during a landing attempt in April. The SLIM probe has vision-based navigation technology. Achieving precise landings on the moon is a key target for JAXA and other space agencies. Resource-rich areas, such as the lunar south pole and its permanently shadowed regions filled with water ice, also present a number of hazards in the form of craters and rocks. Future missions will need to be able to land within a narrow area to avoid these features. SLIM also has a lightweight design that could be favorable as agencies plan more frequent missions and explore moons around other planets such as Mars. If SLIM is successful, JAXA contends, it will transform missions from “landing where we can to landing where we want.”
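To put the energy figures quoted earlier in perspective, photon energy maps to wavelength through the standard relation lambda = h*c / E. The small Python sketch below uses only that physical constant and the article's own numbers; the labels are ours.

# Converting photon energy to wavelength, to show how far XRISM's
# 400-12,000 eV band sits from the 2-3 eV of visible light cited above.
HC_EV_NM = 1239.84  # h*c expressed in eV*nm

def wavelength_nm(energy_ev: float) -> float:
    """Wavelength in nanometers of a photon with the given energy in eV."""
    return HC_EV_NM / energy_ev

for label, ev in [("XRISM band, low end", 400.0),
                  ("XRISM band, high end", 12_000.0),
                  ("visible light, red", 2.0),
                  ("visible light, violet", 3.0)]:
    print(f"{label:22s} {ev:8.0f} eV -> {wavelength_nm(ev):8.3f} nm")
# The X-rays here span roughly 0.1-3.1 nm, hundreds of times shorter than
# visible light, which is why they slip through conventional mirrors.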
Space Technology
The initial curation process for NASA’s OSIRIS-REx sample of asteroid Bennu is moving slower than anticipated, but for the best reason: the sample runneth over. The abundance of material found when the science canister lid was removed earlier this week has meant that the process of disassembling the TAGSAM (Touch-and-Go Sample Acquisition Mechanism) head – which holds the bulk of material from the asteroid – is off to a methodical start. After the collection event on Bennu three years ago, scientists expected they could find some asteroid material in the canister outside the TAGSAM head when they saw particles slowly escaping the head before it was stowed. However, the actual amount of dark particles coating the inside of the canister lid and base that surrounds the TAGSAM is even more than they’d anticipated. “The very best ‘problem’ to have is that there is so much material, it’s taking longer than we expected to collect it,” said deputy OSIRIS-REx curation lead Christopher Snead of NASA’s Johnson Space Center. “There’s a lot of abundant material outside the TAGSAM head that’s interesting in its own right. It’s really spectacular to have all that material there.” The first sample collected from outside the TAGSAM head, on the avionics deck, is now in the hands of scientists who are performing a quick-look analysis, which will provide an initial understanding of the Bennu material and what we can expect to find when the bulk sample is revealed. “We have all the microanalytical techniques that we can throw at this to really, really tear it apart, almost down to the atomic scale,” said Lindsay Keller, OSIRIS-REx sample analysis team member from Johnson. The quick-look research will utilize various instruments, including a scanning electron microscope (SEM), infrared measurements, and X-ray diffraction (XRD), to gain a better understanding of the sample. The SEM will offer a chemical and morphological analysis, while the infrared measurements should provide information on whether the sample contains hydrated minerals and organic-rich particles. X-ray diffraction is sensitive to the different minerals in a sample and will give an inventory of the minerals and perhaps an indication of their proportions. “You’ve got really top-notch people and instruments and facilities that are going to be hitting these samples,” Keller said. This quick-look science is a tool that will offer more data to researchers as they approach the larger pieces of sample for follow-on analysis. Over the coming weeks, the curation team will move the TAGSAM head into a different specialized glovebox where they will undertake the intricate process of disassembly to ultimately reveal the bulk sample within. Rachel Barry, NASA’s Johnson Space Center, Houston
Space Technology
The Blue Moon of August 2023 rises in just one week, offering skywatchers an extra treat this month with another supermoon. The Blue Moon will rise on the night of Aug. 30. Look to the east just after sunset to find it; it won't be difficult to spot given it will be the brightest and largest moon of the year. This moon is notable for a few reasons: For one, not only is it a full moon, but it's also a Blue Moon, which means it's the second full moon in a single calendar month, according to NASA. Plus, this Blue Moon is also a supermoon, meaning it coincides with perigee, the point in the moon's orbit when it's closest to Earth. For observers on the ground, that means it will appear slightly larger than normal, though only about 7% bigger. With the unaided eye, this size difference probably won't be noticeable. The Blue Moon of August 2023 will also be joined by a special guest in the sky: Saturn. The ringed gas giant will be just a few days past opposition, the point at which it lies directly opposite the sun as seen from Earth, making it especially bright in the night sky. As viewed from New York City, Saturn will be in the constellation Aquarius, above and to the right of the moon. From the Southern Hemisphere, however, Saturn will appear below the moon. Blue Moons occur relatively frequently, astronomically speaking, happening once every two to three years. The last Blue Moon rose in August 2021, and the next is expected to rise in August 2024. If you're hoping to catch an up-close look at this forthcoming Blue Moon, our guide to the best binoculars is a great place to start. And if you want to take an even closer look at features of the lunar surface, our guide to the best telescopes can help you find the optical gear you need. But if you're looking to snap photos of the moon or the night sky in general, check out our guides on how to photograph the moon, as well as our best cameras for astrophotography and best lenses for astrophotography. Editor's Note: If you take an awesome photo of the Blue Moon of August 2023 and would like to share it with Space.com's readers, send your photo(s), comments, and your name and location to [email protected].
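The "about 7% bigger" claim follows from simple geometry, since apparent size scales inversely with distance. A quick sketch, assuming typical textbook values for the moon's perigee and mean distance rather than figures from this article:

# Back-of-envelope check on the supermoon size increase. The distances
# below are standard approximate values, not numbers from this piece.
PERIGEE_KM = 363_300.0   # assumed typical perigee distance
MEAN_KM = 384_400.0      # mean Earth-Moon distance

size_increase = MEAN_KM / PERIGEE_KM - 1.0
print(f"Apparent diameter at perigee: {size_increase:.1%} larger than average")
# -> roughly 6%, consistent with the article's "about 7% bigger"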
Space Technology
To help combat the effects of global warming, scientists are toying with an innovative idea to shield our planet from the sun with a spaceborne "umbrella" of sorts. "In Hawaii, many use an umbrella to block the sunlight as they walk about during the day," István Szapudi, an astronomer at the University of Hawaii Institute of Astronomy, said in a statement. "I was thinking, could we do the same for Earth and thereby mitigate the impending catastrophe of climate change?" The reason carbon dioxide and other greenhouse gases contribute to global warming is that they trap sunlight around our planet that should be released back into space, ultimately leading to rising temperatures. But it's the sun, and not greenhouse gases, that creates the heat to begin with. That opens up the idea of building Earth a shade. So, Szapudi drew up an "umbrella" of his own. It would rest at the L1 Lagrange point between the sun and Earth, hypothetically joining sun- or solar-wind-observing probes such as the Solar and Heliospheric Observatory (SOHO) and Advanced Composition Explorer (ACE) that dwell there today. In theory, a large-enough solar shield could effectively block around 1.7 percent of solar radiation at L1, enough to prevent a catastrophic rise in Earth's temperatures. However, any sort of solar shade is bound to face a stark engineering challenge: At L1, it would be subject to the gravitational pulls of both the sun and Earth while experiencing a constant torrent of solar radiation. A viable shade would thus need to be massive — weighing millions of tons — and made of a material sturdy enough to stay in place and stay intact. Simply put, we don't have a practical way of launching that much stuff into orbit. But to get around that issue, Szapudi proposed, much of the material itself can come from space — from a captured asteroid or even lunar dust. That matter could theoretically serve as a counterweight, tethered to a much smaller shield weighing only around 35,000 tons. Right now, even such a smaller shield would be far too heavy for a rocket to lift, but with advances in materials, Szapudi's study suggests we could manage the feat in several decades. Szapudi's apparatus falls under the, well, umbrella of solar geoengineering: the controversial idea of alleviating global warming by physically reducing the amount of sunlight that reaches Earth's surface. Other solar geoengineering ideas include pumping aerosols into the atmosphere and brightening clouds to reflect more sunlight back into space. The study was published on July 31 in the journal Proceedings of the National Academy of Sciences.
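For a sense of scale, a first-order geometric estimate of the shade is straightforward, though it knowingly ignores penumbral effects at L1, which in practice push the required area higher. A rough Python sketch under those simplifying assumptions:

# Crude geometric estimate of the shade area needed to block 1.7% of the
# sunlight reaching Earth. This is a back-of-envelope illustration only,
# not the study's calculation, and it neglects penumbral dilution at L1.
import math

R_EARTH_KM = 6371.0
BLOCK_FRACTION = 0.017

earth_cross_section = math.pi * R_EARTH_KM ** 2      # ~1.3e8 km^2
shade_area = BLOCK_FRACTION * earth_cross_section    # ~2.2e6 km^2
shade_radius = math.sqrt(shade_area / math.pi)       # ~830 km for a disc

print(f"Shade area:  {shade_area:.2e} km^2")
print(f"Disc radius: {shade_radius:.0f} km")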
Space Technology
We are only a few hours away from the NASA Lucy spacecraft’s first close-up look at the small inner main belt asteroid Dinkinesh. Dinkinesh is 10 to 100 times smaller than the Jupiter Trojan asteroids that are the mission’s main targets. The Dinkinesh encounter serves as a first in-flight test of the spacecraft’s terminal tracking system. Lucy’s closest approach will occur at 12:54 p.m. EDT (16:54 UTC) at a distance within 270 miles (430 km) of Dinkinesh. However, there won’t be much time to observe the asteroid at this distance as Lucy speeds past at 10,000 mph (4.5 km/s). Two hours before closest approach, the spacecraft and the rotational platform that holds Lucy’s science instruments (the instrument pointing platform) will be commanded to move into encounter configuration. After this point, the spacecraft’s high-gain antenna will point away from the Earth and the spacecraft will not be able to return data for the remainder of the encounter. Shortly thereafter, the high-resolution grayscale camera on Lucy, L’LORRI, will begin taking a series of images every 15 minutes. (L’LORRI, short for Lucy’s Long Range Reconnaissance Imager, is supplied by the Johns Hopkins Applied Physics Laboratory.) Dinkinesh has been visible to L’LORRI as a single point of light since early September when the team began using the instrument to assist with spacecraft navigation. The team estimates that at a distance of just under 20,000 miles (30,000 km), Dinkinesh may appear to be a few pixels in size, just barely resolved by the camera. Additionally, Lucy’s thermal infrared instrument, L’TES, will begin collecting data. L’TES (formally the Lucy Thermal Emission Spectrometer, provided by Arizona State University) is not designed to observe an asteroid as small as Dinkinesh, so the team is interested to see if L’TES is able to detect the asteroid and measure its temperature during the encounter. An hour before the closest approach, the spacecraft will begin actively tracking Dinkinesh using the onboard terminal tracking system. The spacecraft will use T2Cam (the Terminal Tracking Cameras, provided by Malin Space Science Systems) to repeatedly image the asteroid. In the minutes around closest approach, this system is designed to autonomously reorient the spacecraft and its instrument pointing platform as needed to keep the asteroid centered in the cameras’ field of view. Testing this system is the primary goal of this encounter. Ten minutes before closest approach, the spacecraft is instructed to begin “closest approach imaging” with the L’LORRI instrument. In these images, taken every 15 seconds at three different exposure times, the asteroid will be several hundred pixels across, allowing the team an unprecedented view of this small main belt asteroid, which is estimated to be less than half a mile (1 km) in diameter. Lucy will wait until about six minutes before closest approach to begin taking data with its color imager (the Multi-spectral Visible Imaging Camera, MVIC) and infrared spectrometer (Linear Etalon Imaging Spectral Array, LEISA), which together comprise the L’Ralph instrument (provided by NASA’s Goddard Space Flight Center in Greenbelt, Maryland). About six minutes after the closest approach, L’Ralph will stop taking data, and Lucy will conclude the closest approach observations. By this time, the spacecraft will already be almost 1,700 miles (2,700 km) past the asteroid. 
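Those pixel estimates can be sanity-checked with small-angle geometry. In the sketch below, the ~5 microradian pixel scale is an assumption based on the New Horizons LORRI heritage, and the 790 m diameter simply restates the article's "less than half a mile"; neither number comes from the Lucy team directly.

# Rough check of the imaging numbers quoted above, using assumed values.
PIXEL_SCALE_RAD = 5e-6        # assumed instantaneous field of view per pixel
DINKINESH_DIAMETER_M = 790.0  # ~0.49 mile, consistent with the article

def pixels_across(range_km: float) -> float:
    """Approximate width of the asteroid on the detector at a given range."""
    angular_size = DINKINESH_DIAMETER_M / (range_km * 1000.0)  # radians
    return angular_size / PIXEL_SCALE_RAD

print(f"At 30,000 km: {pixels_across(30_000):5.1f} px")  # ~5 px, 'a few pixels'
print(f"At    430 km: {pixels_across(430):5.1f} px")     # several hundred px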
Lucy will begin a maneuver referred to as a “pitchback” in which it reorients its solar arrays toward the Sun while the instrument pointing platform continues to autonomously track the asteroid as the spacecraft departs. This maneuver is designed to be carried out slowly to minimize spacecraft vibrations as the spacecraft moves its large solar arrays. L’LORRI will image Dinkinesh throughout this process to monitor spacecraft stability. Once the spacecraft is over 8,000 miles (13,000 km) from the asteroid, Lucy will stop actively tracking the position of Dinkinesh. From that point on, the team expects the asteroid to remain visible to the spacecraft’s cameras without the need to reposition the spacecraft or instruments. Two hours after closest approach, the L’TES instrument will be instructed to stop taking data. L’LORRI will continue periodically observing the asteroid for another four days to monitor the light curve of the asteroid. Once Lucy turns its high-gain antenna back toward Earth, it will be able to resume communications, with an approximately 30-minute light-travel-time delay in each direction. The team expects to receive the first signal from the spacecraft within two hours of closest approach. After assessing the health and safety of the spacecraft, the team will command the spacecraft to begin downlinking the data taken during the encounter. It will take up to a week for all data to be returned to Earth via NASA’s Deep Space Network.
Space Technology
The new era of space exploration is opening entirely new possibilities, including the tantalizing prospect of mining for resources on the Moon and asteroids. Sounds exciting—and potentially very profitable—but the reality of the situation is that space mining is completely uncharted territory. Plenty of prospecting needs to be done first to determine if these resources are even economically worth harvesting in the first place. In the next decade, NASA and its collaborators are turning their gaze back to the Moon. The agency is looking to land astronauts there in 2025 as part of the ongoing Artemis program; this would be the first time an astronaut has landed on the Moon since the final Apollo mission in 1972. This new era of exploration is also ushering in a new era of human presence and economics in space, one that may be fueled by mining for resources instead of launching them from Earth aboard rockets. The Moon is likely packed with water ice that could be repurposed into drinkable water. Asteroids are chock full of precious metals to be sold here on Earth. Yet, the future of mining in space is terra incognita, as we are just beginning to scratch the surface of what's possible. Indeed, space is packed with resources that humans will need to survive while exploring and working in the dark void, and for our economies to flourish. The Moon hosts large reservoirs of water ice, which could be mined and used to make drinkable water, oxygen gas for settlements, or rocket fuel for launches off the lunar surface. There's also helium-3, rare earth elements (REEs), and even the dusty regolith to consider. Asteroids too are concentrated sources of valuable elements like platinum, which could be harvested, shipped back to Earth, and sold to industries. At the same time, both public and private space sectors view living in space as a viable opportunity to advance humanity. The plans for space mining are, for the time being, painted in broad strokes, as space agencies and companies begin laying the initial groundwork. Mining the Moon or asteroids for resources could be a huge shortcut in advancing plans for long-term habitability in space since the cost to launch anything from Earth's surface remains incredibly high. Before any ground is broken, however, companies and government agencies will need to run an analysis of the costs associated with mining the various resources to determine if it's economically viable to process these materials directly in space, or to transport those materials back to Earth. They may very well decide that it's simply not worth it, at least for the time being. Harvesting these resources in the harsh environment of space could very well be a logistical nightmare that requires decades of proof-of-concept. Even so, there are decades of research and innovation that point to just how possible space mining may be, and it all began years ago with the planning for the Apollo missions. "The very first meeting in which resources from the Moon were discussed seriously, not just at a science fiction level, was in November of 1962," Angel Abbud-Madrid, director of the Space Resources Program at Colorado School of Mines, told Gizmodo on a phone call. NASA was planning for Apollo at the time, and realizing that its astronauts would need a steady supply of oxygen, the space agency considered extracting it directly from the lunar surface, he explained. 
“It didn’t happen because we were there for just a couple of days, or a couple of hours, but the realization that you need the resources in-situ (i.e. directly at the site itself) has been around that long because of the extremely high cost, and high energy to launch anything from Earth,” said Abbud-Madrid. For NASA, the word “mining” doesn’t quite capture the full picture of harvesting and using resources in space, so the agency instead uses the all-encompassing phrase, “in-situ resource utilization,” or ISRU. This umbrella term not only describes the process of mining the lunar surface for materials and resources, but also the use of those raw materials to produce new products. Take, for example, ice. Lunar geologists have good reason to believe that reservoirs of water ice are tucked within the soil in the Moon’s permanently shadowed regions. Those reservoirs are what NASA’s ill-fated Lunar Flashlight was set to map out. In the not-too-distant future, astronauts on the Moon could mine those reservoirs and melt the ice to top off their drinkable water supply. That water could also be chemically split on the Moon into oxygen and hydrogen, which could supply habitats and bases with breathable air or be used to synthesize rocket fuel and propellant. “ISRU could mean mining something and bringing it back to Earth,” Ben Bussey, chief scientist at commercial lunar lander provider Intuitive Machines, told Gizmodo during a phone chat. “But it could also mean things like building infrastructure that then makes it easier to do things on the Moon.” Astronauts could also take ISRU one step further and strip metal out of the lunar soil to build infrastructure like habitats or launch equipment. Jerry Sanders, ISRU system capability lead at NASA’s Johnson Space Center, says lunar soil contains aluminum, iron, titanium, and silicon, and that those metals could then be processed out of the regolith, forged into purer forms, and used for construction. Regolith could also be a good source of oxygen, as the element is trapped within the soil’s silicate minerals. “All the regolith has somewhere around 42% to 44% oxygen by mass,” Sanders explained during a phone call. “So when we talk about processing the regolith, you get a lot of oxygen.” While astronauts aren’t going to be setting foot on the Moon until Artemis 3 launches in 2025, NASA already has early plans for ISRU operations. Sanders said that the Lunar Trailblazer satellite will continue the hunt for water ice on the Moon’s surface using an infrared spectrometer from orbit. Since infrared light is absorbed by water, scientists can use readings from the probe to potentially identify the size and distribution of these reservoirs of ice, much like Lunar Flashlight was supposed to do. Meanwhile, NASA’s VIPER—Volatiles Investigating Polar Exploration Rover—mission will drill into the lunar surface to find and analyze water ice directly. Lunar Trailblazer and VIPER are scheduled to launch in early and late 2024, respectively. Once this initial prospecting work is done, in a few decades, ISRU operations will be much larger. “That far out in the future, you will be looking at large scale operations. You will have machines that will be drilling, that will be excavating, and that will be transporting material to a certain plant,” Abbud-Madrid said. 
“Everybody is going to need power, communication, and transport, so you’re going to have all of that infrastructure there.” While NASA is planning its own missions to explore the possibilities for ISRU, the agency is also trying to set an example that private space companies can follow. NASA has outsourced its work to private space contractors before—rockets from SpaceX deliver agency payloads to orbit and new spacesuits for the Artemis program are being designed by Axiom Space, for example. In those cases, NASA had developed some sort of engineering framework or jumping off point for space companies to follow, but ISRU is terra incognita, and Sanders says that the private space industry needs to determine if mining on the Moon is even logistically possible before companies jump on board. “Public-private partnerships and commercial involvement is becoming more and more important to succeeding and implementing [NASA’s] objectives,” Sanders said. “Before we can fully commercialize [ISRU], we need to basically help raise the whole technology portfolio such that NASA and the commercial industry feel comfortable enough to take on the job without going bankrupt.” While the Moon’s surface could be a major source for water, oxygen, and more common metals like aluminum and iron, asteroids could be a source of precious elements. Platinum and nickel, for example, are concentrated in the core of metallic asteroids. As Abbud-Madrid explained, as an asteroid grows, its gravity increases, pulling these denser elements into it. Once mined, those metals could be shipped back to Earth to be sold to various industries. With that in mind, asteroids seem like a no-brainer for mining opportunities, but NASA doesn’t currently have any immediate plans to target them. “We are currently focusing mostly on the Moon because it has the nearest term return on investment,” Sanders said. Even though the public sector is focusing on the Moon, some private space companies are forgoing it in favor of asteroids. AstroForge is a California-based asteroid mining company that raised $13 million in funding in May 2022. The company has reportedly planned a method of mining asteroids anywhere from 66 to 4,920 feet (20 to 1,500 meters) in diameter by breaking them apart in space and collecting material, as opposed to landing on the rock and mining it directly. “Platinum-group metals are used across the board—they reduce vehicle emissions, they’re used in chemotherapy drugs, and every electronic device you have has a number of these elements,” AstroForge co-founder Matt Gialich told Gizmodo during a phone interview in May 2022. “The real dream here for us is to go and utilize deep space for resources.” In January, AstroForge announced its two flights set for 2023. In April, AstroForge was expected to launch a spacecraft into orbit with a pre-loaded sample to serve as an asteroid simulant to demonstrate the company’s in-orbit extraction technology in a collaboration with OrbAstro. A spokesperson from AstroForge told Gizmodo in an email that the mission, called Brokkr-1, was “successfully launched, is alive, and is in a healthy state.” Another mission is currently scheduled for October 2023 which will see the company partner with OrbAstro, Intuitive Machines, and Dawn Aerospace to observe an asteroid target in deep space. So, will it all be worth it in the end? In short, probably—but there are a number of factors to consider. 
Though the Moon boasts resources that can enable extended habitation, and asteroids teem with metals that are highly valued here on Earth, a space mining industry cannot thrive without a market for these commodities. A nation that is willing to purchase the oxygen processed from lunar regolith for its settlement on the Moon, for example, will drive the demand to mine more lunar regolith. At the same time, companies and agencies interested in space mining need to do a basic cost-benefit analysis of the resources they're interested in. If they're too difficult to obtain and too difficult to get to a customer, then the business case to mine those resources gets weaker. "How things like prospecting and validation of a resource occurs on Earth, there's a standard process to that. You need to find something, you need to find out if it is economically viable to extract it and use it," Bussey said. "You can have a great source of something, but it could be too hard to get. I think that the same thing will be true on the Moon." Assuming that space miners decide a resource is economically viable enough, and that customers are willing to pay for it, the space mining industry can establish itself and expand. That expansion could fuel a completely secondary economy. The industry will need power, mining equipment, shipping logistics, and staff, all of which could be provided by other companies that are looking for their slice of the pie—the same way people tried to cash in on the California Gold Rush. "Just like mining on Earth in the 1800s when people came to the west to look for gold and silver, there was also all this extraction," Abbud-Madrid said. "People sold shovels and picks and axes and made money out of the miners. Same thing there." Using lunar soil for rocket fuel and selling platinum harvested from an asteroid are fantastical images that feel too far-fetched to ever be feasible, but space mining—even on a small scale—is almost certain to happen in our lifetime. The science points to plenty of resources in our cosmic backyard that have strong financial incentives behind them, but the economics of space mining, for now, are yet to be fleshed out. Even still, civilizations have been living off the land since the dawn of humanity, and as we turn to long-term space habitation and exploration, living off of the Moon and asteroids represents the next frontier.
Space Technology
A commemorative time capsule was buried at the construction site of what will soon be the world's largest visible and infrared light telescope. On Oct. 13, the European Southern Observatory (ESO) celebrated its upcoming Extremely Large Telescope (ELT) by burying a time capsule that was sealed in 2017, when construction first began. The capsule is filled with tokens celebrating ESO staff and the cooperation between the observatory and Chile. It also marks the amazing science and technology behind the 39.3-meter telescope. "Serving as a symbolic message to future generations, it contains mementos from Chilean authorities, including a plaque from the then President of Chile, Michelle Bachelet Jeria, about opening the skies of the country to the questions of an entire planet, as well as drawings from Chilean children featuring the Universe, ESO telescopes and northern Chile landscapes," ESO officials said in a statement. Photographs of ESO staff and a copy of a book describing the future scientific goals of the ELT, which is expected to see its "first light" by 2028, were also preserved. The time capsule was buried in the wall of the ELT dome on Cerro Armazones in the Chilean Atacama Desert, where the ESO currently operates its Very Large Telescope (VLT). The time capsule was covered in an engraved hexagon that is a one-fifth-scale model of one of the ELT's primary mirror segments. The Oct. 13 event was led by ESO Council President Linda Tacconi (Germany) and Vice-President Mirjam Lieshout-Vijverberg (The Netherlands). A few days later, on Oct. 15, ESO Council members mounted a commemorative plaque next to where the time capsule was buried. In July, the ELT reached the halfway point in its construction, with an expected completion date of 2028. While the telescope appears as only a steel structure right now, it will eventually house five separate mirrors, the largest of which will be made up of 798 individual hexagonal segments. At 39.3 meters (129 feet) wide, the ELT will be able to take in more light than current ground-based telescopes and thus provide sharper images of the cosmos, which could aid in the search for life outside Earth and reveal new insight on the nature of dark matter and dark energy. "As the largest optical and infrared telescope in the world, the ELT will shift our understanding of the universe," ESO officials said in the statement. "Its scientific goals range from the solar system to the edge of the observable universe, including exoplanets, black holes and the first stars and galaxies."
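The light-gathering advantage scales with the square of the aperture diameter. A minimal sketch, using an 8.2 m unit telescope of ESO's VLT as the comparison point (our choice of reference, not one made in the statement) and ignoring segment gaps and the central obscuration:

# Comparing idealized light-collecting areas of circular apertures.
import math

def collecting_area_m2(diameter_m: float) -> float:
    """Area of an ideal unobstructed circular aperture."""
    return math.pi * (diameter_m / 2.0) ** 2

elt = collecting_area_m2(39.3)      # ELT primary, per the article
vlt_unit = collecting_area_m2(8.2)  # assumed reference: one VLT unit telescope
print(f"ELT: {elt:6.0f} m^2, VLT unit: {vlt_unit:5.0f} m^2 "
      f"-> ~{elt / vlt_unit:.0f}x more light")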
Space Technology
Huge cyberattack disables telescopes in Hawaii and Chile: National Science Foundation-funded telescopes in Chile and Hawaii have been the victims of a curious cyberattack, ceasing operations at many of them. A major cyberattack has shut down remote connections to prominent National Science Foundation (NSF) space telescopes worldwide, Science reports. Ten telescopes have been impacted for over two weeks now, with on-site operatives able to keep some operational, albeit less efficiently. The shutdowns are causing chaos in the astronomy sphere, with many essential windows of opportunity being missed for space observations. While incredibly frustrating for researchers relying on the telescopes, experts are still none the wiser about why the telescopes were targeted. "NOIRLab is continuing its efforts to diligently investigate and resolve the 1 August cybersecurity incident that occurred in its computer systems. This incident resulted in the temporary shutdown of Gemini North and South telescopes and some of the smaller telescopes on Cerro Tololo in Chile," explained NOIRLab (the NSF-run coordinating center for ground-based astronomy) in a press release update. "The telescopes on Kitt Peak in Arizona are unaffected. The website Gemini.edu is also currently offline. Our staff are working with cybersecurity experts to get all the impacted telescopes and our website back online as soon as possible and are encouraged by the progress made thus far," they added. Cyberattack causes chaos: Due to the ongoing shutdown, research teams are collaborating to find alternatives as crucial observation windows become unattainable. As remote control of many telescopes is no longer available, some groups may now have to send graduate students to places in Chile to relieve exhausted on-site staff who have spent the past two weeks directly operating instruments, explains Science. "On 1 August NOIRLab detected an attempted cyberattack on its computer systems, forcing the suspension of astronomical observations at Gemini North in Hawai'i," NOIRLab said in a post on X on August 2, adding that its website and proposal tools were offline. "We're all in this together," explained Gautham Narayan, an astronomer at the University of Illinois Urbana-Champaign whose team is trying to save its chance to observe new supernovas using one of the affected Chilean telescopes. "[The astronomy community has a] grim determination to press on despite the trying circumstances," he added. On August 1, 2023, NOIRLab announced that its Gemini North telescope in Hilo, Hawaii, was hit by a cyberattack. In response to the incident, NOIRLab shut down operations at the International Gemini Observatory, home to the Hilo telescope and its twin, Gemini South, located on Cerro Pachón mountain in Chile. Thankfully, the latter was already offline for a planned outage. 
NOIRLab has also disconnected its computer network from the Mid-Scale Observatories (MSO) network on Cerro Tololo and Cerro Pachón in Chile. This made remote observations impossible at various telescopes, including the Víctor M. Blanco 4-meter and SOAR telescopes. As a result, NOIRLab has stopped observations at eight other affiliated telescopes in Chile. Motive unknown: NOIRLab has not released any further information about the incident, even to its employees. Cybersecurity experts are puzzled as to why the attacker targeted Gemini North. Von Welch, retired lead of the NSF Cybersecurity Center of Excellence, suggests that the attacker may not even realize they are attacking an observatory. Astronomers are motivated to enhance cybersecurity practices to secure their facilities despite the lack of information on how the Gemini North and NOIRLab systems were compromised. Narayan suggested the entire astronomical community rethink how it manages identity and access software and consider the damage that something as simple as a lost password can cause. "It doesn't help if you build the strongest, most impenetrable fortress in the world if you forget to lock even a single door or window," says Patrick Lin, who leads an NSF-funded space cybersecurity grant at California Polytechnic State University. "The weakest link is often with us, the humans," he added.
Space Technology
Close to seven decades after the launch of humanity's first artificial satellite, Sputnik, Earth is surrounded by millions of pieces of space junk that could collide and cause significant damage to satellites, but most of them are too small to be monitored. Now a new, federally funded project aims to spot and track such dangerous pieces of tiny space debris, for which no technology exists yet. "It's not about the size, it's about the energy," Piyush Mehta, an assistant professor of mechanical and aerospace engineering at West Virginia University who is leading the new project, said in a statement. "It may be the size of a grain of salt, but, because it's traveling so quickly, it might be comparable to a truck moving at 70 miles an hour [113 kph]. You don't want to be in its path." Objects in low Earth orbit (LEO) that are bigger than a softball are currently tracked by the U.S. Air Force's Space Surveillance Network (SSN) using radar and optical sensors stationed around the world. While this technology can monitor the orbits the objects are in and use that information to predict close approaches and potential collisions, there is currently no way to track very tiny space debris objects, which have very high destructive potential given their tremendous speeds. For example, at 250 miles (400 kilometers) up — the average altitude of the International Space Station — objects zoom around Earth at about 17,900 mph (28,800 kph). The latest project is being funded by the U.S. government's IARPA program (short for "Intelligence Advanced Research Projects Activity"). It will first detect and characterize the tiny pieces of debris, after which technologies and algorithms to track the hazardous objects may be developed, according to the same statement. To do so, researchers plan to create and work in a simulated environment that reflects the real-world system, although details are few. One way to monitor tiny pieces of space debris could be via equally small pieces of technology. Last month, Belgian company Arsec began developing a device that uses existing star trackers onboard satellites to map the paths of debris pieces. If it pans out, such a device could boost our understanding of how many minute pieces of debris clutter our local space environment. Experts say that a major collision among pieces of space debris in LEO will happen at some point if we don't change our behavior and/or improve our tracking technology. With recent investments in projects that tackle detecting and monitoring objects in the crowded environment, some solutions to this daunting problem may be on the horizon.
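The quoted speed is roughly what circular-orbit mechanics predicts. A minimal sketch using standard constants; note that debris on crossing orbits can meet at even higher relative speeds:

# Circular orbital speed, v = sqrt(mu / r), at the station altitude above.
import math

MU_EARTH = 3.986e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH_M = 6.371e6   # mean Earth radius, m

def circular_speed_kmh(altitude_km: float) -> float:
    """Speed of a circular orbit at the given altitude, in km/h."""
    r = R_EARTH_M + altitude_km * 1000.0
    return math.sqrt(MU_EARTH / r) * 3.6  # m/s -> km/h

print(f"{circular_speed_kmh(400):.0f} km/h")
# -> roughly 27,600 km/h, the same ballpark as the figure quoted above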
Space Technology
Hundreds of Roman Empire forts popped up in old spy satellite imagery depicting regions of Syria, Iraq and nearby "fertile crescent" territories of the eastern Mediterranean. These satellites were once used for reconnaissance in the 1960s and 1970s, but their data is now declassified. Some of their archived images are now allowing for fresh archaeology finds in Earth zones often difficult for researchers to visit. The newly found 396 forts, spotted straight from space, confirm and extend an aerial survey of the region performed in 1934; this survey had recorded 116 forts on the Roman Empire's eastern frontier. Archaeologists continue to agree with the basic conclusion of that nearly century-old study, which is that Rome was fortifying its frontier — and the new study brings fresh perspective. "These forts are similar in form to many Roman forts from elsewhere in Europe and North Africa. There are many more forts in our study than elsewhere, but this may be because they are better preserved and easier to recognize," lead author Jesse Casana, a professor of anthropology focusing on the Middle East at New Hampshire's Dartmouth College, told Space.com in an email interview. "However, it could also have been a real product of intensive fort construction, especially during the second and third centuries AD." The origins of Roman Empire forts: Most historians say the Roman Empire began around 27 BCE. The older Republic had been in the throes of a lengthy civil war after a group of senators assassinated the dictator Julius Caesar in 44 BCE, alleging Caesar had grown too powerful. Eventually, the Senate backed one of the rivals for Rome's leadership — Octavian, Caesar's heir — and gave the young man temporary dictatorship powers, as well as military backing. Very simply put, this resulted in Octavian overcoming his rivals. In 27 BCE, he received one-man leadership powers permanently from the Senate. Now called Augustus ("the exalted one"), he declared his aim was to "restore the Republic" while consolidating power for himself and his successors. The line of "imperators" (emperors) continued after Augustus for centuries. But during the period when the study's newly found Roman forts were constructed – roughly spanning the second and sixth centuries CE, though other times are likely included – various difficulties were arising. Particularly in the third and fourth centuries, for example, there was no established line of emperor succession, leading to repeated assassinations and coups. The huge Roman Empire, stretching at its largest from Britain to Egypt, was also struggling to maintain its borders, in part due to sheer size and in part due to incursions from nomadic groups grappling with climate change. Following a few reorganizations, the Roman Empire was officially divided between two heirs in 395 CE, after the death of Emperor Theodosius I. The western side was gradually taken over by other peoples, while the eastern side persisted in what we now call the Byzantine Empire down to roughly the 1400s CE. That brings us to how the newly found forts functioned at the empire's edge. In a 1934 study based on flights performed in the 1920s, pioneering French archaeologist Antoine Poidebard found 116 forts in an aerial survey, the study authors stated. He suggested the fortifications were supposed to be a defensive line against Persians (more properly, the Parthians and the Sasanians, who were other superpowers of the era). 
But a limitation of his work is that he mainly flew his plane where he believed forts would be found. The forts, to be fair, were surveyed before the existence of modern-day archaeological standards. Casana and fellow researchers' new satellite image study was, on the other hand, able to cover more ground and counteract the Poidebard study's bias. It showed the freshly discovered 396 forts had no discernible defensive north-south pattern against eastern peoples, and were instead scattered. The new results may confirm the suspicions of some earlier scholars, who argued the 116 Poidebard forts were too far apart to form a connective line of defenses. Instead, the encampments in modern-day Syria and Iraq were possibly used to protect caravans bringing valuable goods to and from Rome's provinces, while allowing for communications and intercultural exchanges. The story of the satellites: The study images came via two satellite programs originally used for surveillance during the Cold War between the United States and the Soviet Union (and their respective allies). The nations pursued military technologies (including early space missions) on "political, economic, and propaganda fronts" with minimal use of weapons, according to Encyclopedia Britannica. During this time, the "Space Race" was also in full force, seeing both space powers rapidly accrue milestones with human and robotic space missions, such as launching the first people and sending spacecraft around the solar system. (The rivalry sometimes coalesced into moments of collaboration, however, such as the Apollo-Soyuz Test Project space mission that launched astronauts and cosmonauts together in 1975.) One of the aims of the Cold War was rapid military reconnaissance using satellites that could promptly return photographic images to Earth. The Central Intelligence Agency's Corona program, with assistance from the U.S. Air Force, imaged areas in nations such as China and the Soviet Union between 1959 and 1972. A successor program called Hexagon (also called Big Bird, KH-9 or KeyHole-9) continued surveying Soviet military zones between 1971 and 1986, led by the National Reconnaissance Office. But most importantly for the new study, these satellites were specifically built to take clear and precise images. "Because these images preserve a high-resolution, stereo perspective on a landscape that has been severely impacted by modern-day land-use changes, including urban expansion, agricultural intensification and reservoir construction, they constitute a unique resource for archaeological research," the study authors stated in their work, published Thursday (Oct. 25) in Antiquity. And it was actually the images' declassification that offers such rich data harvesting grounds for archaeologists, Casana told Space.com, as the pictures are easy to source and relatively inexpensive. "All the satellite images we used in this study are publicly available through the U.S. Geological Survey, who serve them on their EarthExplorer data distribution portal," he explained. "Images that are already scanned can be downloaded there for free, while unscanned images can be purchased for $30 USD." After downloading the images, however, came hours of processing to georeference and spatially correct the images. These processes are needed to accurately map features on the Earth's surface using GPS technology, which itself was originally developed for the military as well. 
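Georeferencing a scanned film frame of this kind typically means tying pixel positions to known ground coordinates and then warping the image. The sketch below is a generic illustration using GDAL's Python bindings, with hypothetical file names and control points; the team's actual processing chain is not described here in detail.

# Generic georeferencing sketch with the GDAL Python bindings. The file
# names and ground control points (GCPs) are hypothetical placeholders.
from osgeo import gdal, osr

src = gdal.Open("corona_frame_scan.tif")  # hypothetical scanned frame

# Each GCP ties a pixel/line position in the scan to a ground lon/lat.
gcps = [
    gdal.GCP(38.95, 36.20, 0, 120.0, 80.0),    # lon, lat, elev, pixel, line
    gdal.GCP(39.10, 36.18, 0, 4980.0, 95.0),
    gdal.GCP(39.08, 35.95, 0, 4950.0, 3890.0),
    gdal.GCP(38.97, 35.97, 0, 140.0, 3860.0),
]

srs = osr.SpatialReference()
srs.ImportFromEPSG(4326)  # WGS84 geographic coordinates

# Attach the GCPs, then warp; thin-plate splines help correct the
# panoramic distortion typical of scanned Corona film.
tagged = gdal.Translate("/vsimem/tagged.tif", src, GCPs=gcps,
                        outputSRS=srs.ExportToWkt())
gdal.Warp("corona_frame_georef.tif", tagged, tps=True, dstSRS="EPSG:4326")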
Archaeologists have reshared most of their work with the community via the Corona Atlas Project led by the University of Arkansas' Center for Advanced Spatial Technologies. The site even includes "a basic archaeological site database for the Middle East to help locate sites of interest," Casana said. Aside from continuing the work of Poidebard, who was cited as influential to "a long history of scholarship" in the new study, the declassified Corona and Hexagon imagery provide other benefits to archaeologists. Ancient sites are subject to many threats, Casana pointed out. The public and media focus on the damage caused by looting and the military, but archaeologists find that "destruction of sites by urban development, agricultural intensification, and dam construction are far more widespread and severe," he said. (Climate change has a role to play in these problems, too, as communities seek to protect food and water resources against a warming planet.) "The real value in historical, high-resolution imagery like Corona and Hexagon is in preserving a picture of a landscape that by and large no longer exists," Casana said, noting the spy imagery is roughly half a century old and there has been a lot of change in Iraq and Syria since then. "Our study also helps show that an unknown number of other sites were also likely lost in the time between Poidebard's flights in the 1920s and the Corona imagery of the late 1960s," he added. More broadly, the study may also add nuance to how the Romans managed their empire frontiers. Ancient Romans were famously militaristic and well-known for incursions reaching areas as far as Britain; they sometimes even fought with or allied with local tribes depending on the local commander's (or emperor's) purpose. At the same time, however, the Romans depended on trade and valued it. The researchers say their new fort study may provide further evidence of the Roman Empire's interregional links. But the new study might be subject to preservation bias, the authors warn. The density of forts seen in some areas – as well as the distribution of those forts that remain visible after all the eons – may reflect the reality that many others were lost due to "settlement and land-use practices," the authors stated. And the ground continues to change rapidly; many forts Poidebard spotted were no longer visible just a generation later, in the spy satellite images. That said, the archaeologists have found an additional 106 "fort-like features" in a subregion of the satellite study, in which future discoveries may lurk. "We are planning to expand the survey to prospect for more sites, including forts and others," Casana said. "We will work within our current survey area using additional forms of imagery, such as the more recently declassified Hexagon and U2 spy plane imagery, as well as expanding regionally into other parts of the Middle East."
Space Technology
Non-profit organization Astronomers Without Borders (AWB) brings the wonder of the solar eclipse into the palm of your hands with a new eclipse app for iOS devices. "One Eclipse" is designed to enhance the experience of solar eclipses for people of all ages and backgrounds. Developed as part of a partnership between AWB and Simulation Curriculum Corp., the app features the cutting-edge capabilities of planetarium software SkySafari 7 to give users a front-row experience of the 2023 annular solar eclipse. With the interactive eclipse map, users are able to pinpoint the perfect viewing spot for observing the upcoming eclipse, and the handy countdown timer lets you see the exact moment when you can expect to experience the moon's shadow. The eclipse simulator lets you see what the eclipse will look like from any location on Earth, from start to finish. If you're passionate about eclipses and astronomy in general, you can also share your eclipse-viewing experience with AWB's celestial storytelling community and help inspire other like-minded individuals to get involved. AWB is a non-profit organization with a membership that spans 145 countries. It is dedicated to spreading the wonder of astronomy worldwide. "Our vision is a global community that appreciates, studies, and shares the wonders of the universe, to broaden perspective, transcend borders, and improve lives," according to AWB's official website. You can help AWB with its endeavors, as 50% of every purchase of "One Eclipse" goes to support AWB in its STEM outreach projects designed to help underserved communities around the world and its highly successful eclipse glasses recycling program. The annular solar eclipse on Oct. 14 will be visible across the Americas, spanning 10 countries in total. Only viewers within the path of annularity, which is 118 to 137 miles (190 to 220 kilometers) wide, will see the full "ring of fire" effect when the moon partially obscures the sun's disk, leaving a thin ring of light around it. If you're not able to watch the event in person you can watch the annular solar eclipse online from one of the various free livestreams. As with all solar viewing ventures, remember to take necessary precautions and to NEVER look at the sun directly. To safely view this solar eclipse you must use solar filters at all times. Whether your location will experience a partial solar eclipse or an annular solar eclipse, the dangers are the same. Observers will need to wear solar eclipse glasses, and cameras, telescopes and binoculars must have solar filters placed in front of their lenses at all times. Our how to observe the sun safely guide tells you everything you need to know about safe solar observations.
Space Technology
Sentinel-1 reveals shifts from Morocco earthquake: Following the devastating earthquake that struck Morocco on 8 September, satellite data have been made available through the International Charter "Space and Major Disasters" to help emergency response teams on the ground. In addition, radar measurements from Europe's Copernicus Sentinel-1 satellite mission are being used to analyze how the ground has shifted as a result of the quake, which will not only help in planning the eventual reconstruction but will also further scientific research. The powerful 6.8 magnitude earthquake struck the Atlas Mountains, about 75 km from Marrakech, in the late evening of Friday 8 September. It occurred in a region that lies along the fault lines of the European and African tectonic plates—but, nonetheless, it was a rare event for western Morocco. Unfortunately, the quake claimed thousands of lives, caused buildings and homes to collapse and blocked roads. It even caused buildings to sway as far away as the country's northern coast. On behalf of the International Federation of Red Cross and Red Crescent Societies, the United Nations Institute for Training and Research triggered the International Charter "Space and Major Disasters." Through the Charter, Earth observation assets from different space agencies are combined so that satellite images of areas struck by extreme events can be provided as fast as possible to define and map the extent of the disaster and hence help teams in their rescue efforts. Satellites routinely monitoring Earth from space and delivering data to support rapid damage mapping offer a unique tool to aid disaster management. Since a single space agency or satellite operator alone cannot meet the demands of disaster management, ESA and France's CNES space agency initiated the International Charter Space and Major Disasters in 1999. The Copernicus Emergency Mapping Service was also activated to help share satellite data in response to the Moroccan earthquake in line with the operational cooperation that is in place with the International Charter. So far, images from the French Pléiades very high-resolution satellites have been used to generate detailed damage maps over the affected areas. Pléiades is also part of ESA's Third Party Mission program. The first damage mapping products were released on 11 September, and more followed. These maps, such as that shown above, can be used by rescuers to decide on the best course of action, by identifying which roads to take and which bridges to avoid in case of collapse, for example. Philippe Bally, ESA representative of the International Charter, said, "Clearly time is of the essence when disaster strikes. Via the Charter and the Copernicus Emergency Mapping Service, it is part of our job to help ensure that the appropriate satellite data is delivered to help relief efforts." While the response to Morocco's disaster is ongoing, scientists are using measurements from the Copernicus Sentinel-1 mission in a technique known as "interferometry" to compare before-and-after views of the region. The Copernicus Sentinel-1 mission carries a radar instrument that can sense the ground and can "see" through clouds, whether day or night. Among the mission's many uses, it routinely traces subtle changes in elevation of Earth's surface. When an earthquake occurs, changes to the surface are obviously more pronounced than gradual subsidence or uplift. 
These radar images allow scientists to observe and analyze the exact effects that earthquakes have on the land surface. In the case of the Moroccan earthquake, Sentinel-1 data have been combined to measure surface displacement that occurred between an acquisition on 30 August and one after the earthquake on 11 September. This has led to an interferogram that shows a colorful "fringe" pattern and allows scientists to understand more about the nature of the quake and the risk of further hazards in the future. Dr. Bally explained, "Immediately after Sentinel-1 had acquired data over the earthquake area, the Earth observation processing chains available in the user community allowed information on terrain deformation to be retrieved. "This is the case with the interferogram generated in an automated and rapid fashion by the UK-based COMET LiCSAR, and with the Geohazard Exploitation Platform using the DIAPASON InSAR service of the CNES French space agency. "Precise deformation maps are generated for geohazard science purposes and can be used to advise disaster response teams concerning the hazard event." ESA's Director of Earth Observation Programs, Simonetta Cheli, noted, "The Charter and the Copernicus Emergency Mapping Service are extremely valuable tools to support vital relief efforts when disaster strikes. "Satellites orbiting Earth are unique in their ability to not only provide wide views of affected areas but also very detailed information, as we have seen here provided by the Pléiades mission. "Since the Copernicus Sentinel-1 mission carries a radar, it can see through clouds, so it is also often used for mapping serious floods. In the case of the Moroccan earthquake, the mission's value has been to measure how the surface has shifted, which will be important once the immediate crisis is over and restoration can start." Provided by European Space Agency
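For readers curious how those interferometric fringes encode ground motion, the core arithmetic is compact. The sketch below is a minimal illustration of the general InSAR principle, not ESA's processing chain: it assumes two co-registered complex radar images held as NumPy arrays (the names `before` and `after` are placeholders) and uses Sentinel-1's roughly 5.5 cm C-band wavelength. Real products such as the COMET LiCSAR interferogram also involve co-registration, topographic corrections and phase unwrapping on top of this.

```python
import numpy as np

# Sentinel-1 C-band radar wavelength, roughly 5.55 cm.
WAVELENGTH_M = 0.0555

def interferogram_phase(before: np.ndarray, after: np.ndarray) -> np.ndarray:
    """Interferometric phase of two co-registered complex SAR images.

    Multiplying one image by the conjugate of the other leaves only the
    phase difference, which wraps to (-pi, pi] -- the fringe pattern.
    """
    return np.angle(after * np.conj(before))

def los_displacement_m(phase: np.ndarray) -> np.ndarray:
    """Convert (unwrapped) phase to line-of-sight displacement.

    The radar path is two-way, so one full 2*pi fringe corresponds to
    half a wavelength of motion: d = phase * wavelength / (4*pi).
    """
    return phase * WAVELENGTH_M / (4.0 * np.pi)

# Toy example: a uniform quarter-fringe shift between the acquisitions.
before = np.ones((4, 4), dtype=complex)
after = before * np.exp(1j * np.pi / 2)
print(los_displacement_m(interferogram_phase(before, after)))  # ~6.9 mm everywhere
```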
Space Technology
Japan's space agency launched a rocket on September 6 at 7:42 PM EDT carrying a telescope that's more advanced than NASA's Chandra and other X-ray observatories already in orbit. The X-Ray Imaging and Spectroscopy Mission — or XRISM, pronounced "crism" — is a mission led by JAXA (Japan Aerospace Exploration Agency) in collaboration with NASA and with contributions by the European Space Agency. Lia Corrales, a University of Michigan astronomer and mission participant, told The New York Times that XRISM represents "the next step in X-ray observations." The telescope is considered more powerful than its predecessors because of its tools. One of them, called Resolve, is a microcalorimeter spectrometer with the capability to measure tiny increases in temperature when X-rays hit its 6-by-6-pixel detector. It must operate in an environment that's a fraction of a degree above absolute zero, enabled by a multistage mechanical cooling process inside its refrigerator-sized container with liquid helium. But so long as it's working, the tool can measure the energy of each individual X-ray and can provide information on its source's composition, motion and physical state. The Times says the mission team expects Resolve's spectroscopic data to be 30 times sharper than what Chandra's instruments can provide. It can detect X-rays with energies that range from 400 to 12,000 electron volts, which NASA says can give us the data needed to know more about the hottest regions, the largest structures and the objects with the strongest gravitational pull in the universe. XRISM's science operations won't begin until January, though, since scientists still have to switch on its instruments and tune them in the next few months. In addition to XRISM, the rocket also blasted off to space carrying the Smart Lander for Investigating Moon (SLIM) mission. The small-scale lander was nicknamed "Moon Sniper" because it was designed to demonstrate that a pinpoint landing within 100 meters of a specific target is possible. Based on the latest information from JAXA, XRISM has already separated from its rocket and been inserted into orbit. Meanwhile, SLIM will keep traveling for months until it reaches the moon. In a post on X the day after launch, the XRISM team confirmed solar acquisition control, data reception at the Uchinoura station and solar array paddle deployment, reporting that the mission is on track.
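To put Resolve's 400-to-12,000-electron-volt band in perspective, photon energy and wavelength are related by λ = hc/E, so the band limits quoted above translate directly into wavelengths. A quick sketch of that conversion:

```python
# Convert Resolve's quoted X-ray band (400-12,000 eV) to wavelengths.
HC_EV_NM = 1239.84  # Planck's constant times the speed of light, in eV*nm

def wavelength_nm(energy_ev: float) -> float:
    """Photon wavelength in nanometres from its energy in eV."""
    return HC_EV_NM / energy_ev

for e in (400, 12_000):
    print(f"{e:>6} eV -> {wavelength_nm(e):.3f} nm")
# 400 eV -> ~3.100 nm; 12,000 eV -> ~0.103 nm
```

Both ends of the band sit far below visible wavelengths (roughly 380 to 750 nm), which is why X-ray astronomy needs detectors like Resolve rather than conventional optics.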
Space Technology
Despite a rough start to its six-year mission, the Euclid space telescope is ending its commissioning phase on a high after finally being able to find its guide stars again. Thanks to a clever software patch, issues with the telescope's Fine Guidance Sensor have been resolved and it is now ready to undergo its final testing in full-on science mode, the European Space Agency recently announced. ESA's Euclid telescope launched on July 1 to explore the mysteries of the dark universe—parts of the cosmos made up of dark energy and dark matter. The first few months of its journey, however, have not been smooth sailing. Euclid's fine guidance sensors were occasionally losing track of guide stars, which the mission uses to point precisely at regions of the cosmos. The problem was caused by cosmic rays, such as high-energy radiation emanating from the Sun's solar flares, which resulted in false signals appearing in Euclid's observations. "These false signals intermittently outnumbered real stars and Euclid's sensor failed to resolve star patterns that it needed to navigate," ESA wrote. The above image is a result of Euclid failing to lock into place while observing a field of stars. Instead, the dizzying image shows swirls of star trails as the telescope tried to focus on its target. The team on the ground designed a software patch to be installed on Euclid. "We carefully tested the software update step by step under real flight conditions, with realistic input from the Science Operations Centre for observation targets, and finally the go-ahead was given to re-start the Performance Verification phase," Micha Schmidt, Euclid spacecraft operations manager, said in a statement. The Euclid telescope can now be tested in space while in science mode, an important step that will last until late November before the mission begins its real work. "Now comes the exciting phase of testing Euclid in science-like conditions, and we are looking forward to its first images showcasing how this mission will revolutionise our understanding of the dark Universe," Carole Mundell, ESA's director of science, said in a statement.
Space Technology
For the first time, a major leak in the UK of the extremely potent greenhouse gas methane has been spotted from space. Its detection by satellite raises hopes that future leaks can be stopped more quickly. Methane has 28 times the heating potential of CO2 over a 100-year period and is a major contributor to global warming. In energy terms, the gas that leaked over three months before being stopped could have powered 7,500 homes for a year. The leak from a pipeline in Cheltenham, revealed exclusively to the BBC, was discovered in March. It was detected by Leeds University with the help of specialist satellites. Methane is responsible for about 30% of the rise in global temperatures. Emily Dowd, a PhD researcher at the School of Earth and Environment and the National Centre for Earth Observation, University of Leeds, had been using satellite imagery to assess methane leaks from landfill sites. But she noticed on the images the distinct marker of a methane leak some miles away, coming from a gas pipeline owned by Wales and West Utilities. Identifying and tackling methane emissions is a crucial objective of the UK and other countries seeking to tackle climate change. Upon discovering the leak, Ms Dowd worked with GHGSat - whose satellites provided the original images - to take further surveys from space, while a team from Royal Holloway University made on-the-ground measurements. Ms Dowd said: "Finding this leak brings a question of how many there are out there and maybe we need to be looking a bit harder to find them and take advantage of the technology we have." Wales and West Utilities said they became aware of the leak after a member of the public reported the smell of gas. They said they were in the process of obtaining the necessary permissions for replacing the gas mains when the leak was picked up by satellite. But the satellite detection process has shown the potential of picking up methane leaks quickly. The main sources of methane are the oil and gas industry, farming and landfill sites. UK methane emissions have fallen significantly since 1990 but in recent years progress has slowed. Currently methane leaks are detected through routine on-the-ground surveys - a very challenging prospect when there are thousands of miles of pipes and sites. And the UK's methane emissions are only an estimate gleaned from economic activity data. Jean-Francois Gauthier, senior vice-president for strategy at GHGSat, told the BBC: "It's important to highlight that satellites are just one piece of the puzzle. But satellites have a very unique value ... that they can come back [and collect more images] very frequently and they can do so without the need to deploy people on the ground so they can do so effectively and also affordably." The company has nine satellites in its constellation, which orbit 500km overhead and are among the highest-resolution devices of their kind, able to detect gases at 25m resolution. The company has recently signed a £5.5m partnership with the UK - funded by the UK Space Agency - to provide satellite data on methane emissions to UK organisations such as Ordnance Survey. The UK Space Agency's CEO, Dr Paul Bate, said: "Satellites are getting smaller and more powerful, giving us an ideal vantage point from which to monitor global greenhouse gas emissions and inform decision-making on the path to Net Zero." There are still limitations with the satellites that will need to be developed.
Prof Grant Allen, lecturer in atmospheric science at the University of Manchester, told the BBC: "There is still some work to do to fully validate the precise magnitude of such emissions estimated by satellites like GHGSat, but the capability is already proving super useful for identifying where big (preventable) sources may be."
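The "28 times" figure is methane's 100-year global warming potential, and converting a leak into CO2-equivalent terms is a single multiplication. A minimal sketch using only that factor (the 100-tonne input is a made-up illustration, not the size of the Cheltenham leak, which the article does not quantify):

```python
GWP_100_CH4 = 28  # methane's 100-year global warming potential vs CO2

def co2_equivalent_tonnes(ch4_tonnes: float) -> float:
    """CO2-equivalent of a methane release over a 100-year horizon."""
    return ch4_tonnes * GWP_100_CH4

# Hypothetical example: a 100-tonne methane leak.
print(co2_equivalent_tonnes(100))  # 2,800 tonnes CO2-equivalent
```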
Space Technology
Space needs better 'parking spots' to stay usable, and an engineer is finding them Any mission headed to space needs a "parking spot" at its destination. But these parking spots, regions located on orbits, are quickly becoming occupied or more vulnerable to collisions. Most objects launching to space are satellites, which can travel faster than 4 miles per second in the regions where they park. About 10 times the number of satellites currently in space are expected to launch by 2030. Simultaneously, satellite constellations are increasing in number and size. These are groups of satellites working together as a system, such as for enabling GPS, observation of Earth, internet access and other types of communications. "With this density of satellites, something is going to fail and cause a collision. It's just a matter of probability," said David Arnas, an assistant professor of aeronautics and astronautics in Purdue University's College of Engineering. "Satellite constellations are getting so big and numerous that it's becoming impossible to accurately track them all and ensure their long-term safety even through computational means." Arnas and his graduate students are investigating how orbits could be used to design better parking spots for satellites both in areas closer to Earth, where many of these available locations have already been taken, and in other parts of space that will soon see an increase in satellite population, such as the large area between Earth and the moon called the cislunar region. His research group also is coming up with new methods for feasibly analyzing satellite constellations as they increase in size. Arnas' goal is to make space more equitable. Putting spacecraft in designated parking spots instead of just anywhere could reduce the likelihood of space becoming too cluttered for missions to safely take place. "Space is a common resource of humanity, just like water and air. Even if it seems very vast, it is still limited. It is our responsibility to ensure that future generations will also have fair access to it," he said. Helping satellite constellations get bigger more safely No matter whether satellite constellations are located closer to Earth or eventually near the moon, space debris is an unavoidable issue. Within just one month, pieces of debris from a satellite explosion or collision in low Earth orbit can cover the whole Earth. This debris could stick around for anywhere from a few years to several hundred years, depending on the altitude. If low Earth orbit becomes more crowded, satellites will have few places where they can quickly get out of the way of debris before getting hit. This presents a mess of a math problem. But Arnas and his students are identifying how to organize large satellite constellations so that it's feasible to predict how they should reconfigure when a massive debris cloud is headed their way. "If we have a lot of satellites in an area where there's been a fragmentation event, we will have to move these satellites. This means that we have to optimize not only the final positions of the satellites, but also the maneuvers that each satellite would have to perform in a very short period of time. And right now, that's not possible to do if several large constellations are involved," he said. "However, if you have a general structure, a distribution containing all satellites in the region, it's not only possible, but something that we can do even with pen and paper. 
We can foresee the possibilities of reconfiguration and react very quickly if something unexpected happens." Arnas has made findings about how to estimate orbital capacity, reduce the risk of collisions within satellite constellations, and design satellite orbits that are more resilient to disturbances. One method he developed would help to calculate the minimum distance that satellites should maintain from each other so that no matter what happens in a particular orbit, each satellite would be far enough away to avoid a collision. He's also proposed a new way to analyze large satellite constellations in subsets so that they are easier to study. Currently, there are few policies regulating where satellites can be put in space. Through the tools he's creating, Arnas hopes to help inform decision-makers on what the consequences could be for launching a new satellite or establishing a new constellation. "I want to give policymakers a way to know how approving a mission is going to affect the future capacity and sustainability of the space sector," he said. Making travel between Earth and the moon more fuel-efficient The increase in space missions and satellite density doesn't just affect spacecraft orbiting close to Earth. Dozens of missions may be traveling through the cislunar region over the next few years, but it's hard to chart the trajectories spacecraft should take for each individual mission. Solar radiation and the combined gravitational pull of the Earth, moon and other planets have a large effect on orbits and how they're used. To help solve this issue, Arnas' research group is exploring how so-called resonant orbits could be used to design these trajectories and help spacecraft save fuel when traveling the 238,900 miles from Earth to the moon. Arnas and Purdue graduate student Andrew Binder are building on an idea NASA explored in the past to propel satellites from low Earth orbit without expending fuel by using very long cable structures called "tethers." Applying this idea to the cislunar region, Arnas and Binder envision building a reusable infrastructure in space based on a pair of tethers that could "catch and throw" satellites between Earth and the moon. One tether would be in orbit around Earth and the other would orbit the moon. The tethers would provide the necessary impulse for satellites to cross cislunar space so that they won't have to use up fuel to perform that trip. Although their findings are preliminary, Arnas and Binder are developing more complex models of this tether system that they hope could help lead to a more streamlined way to travel through cislunar space. "If missions to the moon and back are going to become more common, then it could be very useful to have an infrastructure already built in orbit to transition payloads in the cislunar system," Arnas said. Provided by Purdue University
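The "faster than 4 miles per second" figure early in the piece follows from the circular-orbit relation v = sqrt(mu/r). A quick check, assuming a representative constellation altitude of 550 km (an illustrative choice; the article does not give one):

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
EARTH_RADIUS_M = 6.371e6
MILE_M = 1609.344

def circular_orbital_speed(altitude_m: float) -> float:
    """Speed of a circular orbit at a given altitude: v = sqrt(mu / r)."""
    return math.sqrt(MU_EARTH / (EARTH_RADIUS_M + altitude_m))

v = circular_orbital_speed(550e3)  # 550 km, a typical constellation altitude
print(f"{v / 1000:.2f} km/s = {v / MILE_M:.2f} mi/s")  # ~7.59 km/s, ~4.7 mi/s
```

At these speeds even centimetre-scale debris carries enormous kinetic energy, which is why the reconfiguration planning described above has to happen quickly.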
Space Technology
Shortly after launching on 1 July, the European space observatory Euclid started performing tiny, unexpected pirouettes. The problem revealed itself during initial tests of the telescope’s automated pointing system. If left unfixed, it could have severely affected Euclid’s science mission and led to gaps in its map of the Universe. Now the European Space Agency (ESA) says that it has resolved the issue by updating some of the telescope’s software. The problem occurred when the on-board pointing system mistook cosmic noise for faint stars in dark patches of sky, and directed the spacecraft to reorient itself while capturing a shot. Giuseppe Racca, Euclid project manager at ESA in Noordwijk, the Netherlands, says that the updated pointing system will operate slightly slower than planned. As a result, the main mission, due to last six years, could take up to six months longer. Its scientific goals should not be affected, ESA says. Mapping the Universe Euclid is designed to carry out a deep survey of the Universe by mapping the positions of 1.5 billion galaxies in 3D, looking beyond the stars in the Milky Way. But to do so, it will often have to photograph some of the darkest patches of the sky, which have only very faint stars. Euclid must use the known positions of those stars — as previously mapped by another ESA mission, Gaia — to find the correct patch and continuously adjust its position to extremely high precision for more than 10 minutes at a time. Initial tests of this system showed that, in some cases, the telescope was not pointing stably. Instead, it would wobble, producing test images in which some stars appeared to follow tiny looping trails. ESA says that the Euclid team, together with its principal industrial contractor, Thales Alenia Space, was able to diagnose the problem quickly. The pointing system uses auxiliary sensors inside the telescope to take periodic 2-second exposures of the field of view. It then matches the stars it sees with those in the Gaia catalogue, to make sure they are in the expected positions. But the sensors also pick up noise from energetic particles such as cosmic rays, which continuously rain onto the probe from all directions, explains Giovanni Bosco, a physicist at Thales Alenia Space in Turin, Italy. Within 100 milliseconds, the on-board software has to filter that noise and single out the real stars. This didn’t always work out as planned, says Racca. “Sometimes it had too few stars, and it was getting confused. It was losing the guiding stars and then automatically started to look for them again.” Bosco worked with the team at subcontractor Leonardo in Florence, Italy, to fix the problem by improving how the algorithms filter out cosmic noise. ESA has now tested the system and announced on 5 October that it is working as planned. Rogue light Another issue spotted in early imaging tests was that tiny amounts of stray light seemed to be entering the telescope — despite it being protected by a sunshield and wrapped in multiple layers of insulation. The problem was probably caused by a thruster that sticks out to one side of the spacecraft, where it is not protected by the sunshield, says Racca. When the telescope was oriented at certain angles, sunlight was ricocheting off a 1-square-centimetre area on the thruster — the only part of it that is not painted black — and bouncing from the back of the sunshield onto the side of the telescope. A small fraction of this light could be detected by Euclid’s super-sensitive cameras. 
The mission team found that the problem went away after simply adjusting the orientation of the probe by 2.5 degrees. Racca says that the mission can now resume its planned commissioning stages, and expects that it will be able to begin its scientific work some time in November. “When I heard about the problems and the solutions they were trying out, to me it sounded like this will work out,” says Anthony Brown, an astronomer at Leiden University in the Netherlands and a senior member of the Gaia science team. Still, he adds, whenever problems with a space mission can be overcome, “it’s always an immense relief.” This article is reproduced with permission and was first published on October 6, 2023.
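ESA has not published the patched algorithm, but the fix it describes rests on a simple observable: cosmic-ray hits land at random positions in a single exposure, while a real guide star persists across consecutive exposures. A hypothetical sketch of that persistence idea, with detections represented as (x, y) pixel positions (the function name and threshold here are invented for illustration):

```python
import math

def persistent_detections(exposure_a, exposure_b, tolerance_px=1.5):
    """Keep only detections that appear in both short exposures.

    Cosmic-ray hits appear at random positions in one exposure, while a
    real star shows up at (nearly) the same pixel in both, so requiring
    a match across exposures filters out most false signals.
    """
    stars = []
    for (xa, ya) in exposure_a:
        for (xb, yb) in exposure_b:
            if math.hypot(xa - xb, ya - yb) <= tolerance_px:
                stars.append((xa, ya))
                break
    return stars

# One real star near (120, 85) plus unrepeated cosmic-ray hits.
first = [(120.0, 85.0), (40.2, 310.9), (200.5, 17.3)]
second = [(119.6, 85.4), (310.0, 44.8)]
print(persistent_detections(first, second))  # [(120.0, 85.0)]
```

A real onboard filter would then match the survivors against the Gaia catalogue positions, as the article describes, and would have to fit within the 100-millisecond budget.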
Space Technology
NASA's Europa probe gets a hotline to Earth NASA's Europa Clipper is designed to seek out conditions suitable for life on an ice-covered moon of Jupiter. On Aug. 14, the spacecraft received a piece of hardware central to that quest: the massive dish-shaped high-gain antenna. Stretching 10 feet (3 meters) across the spacecraft's body, the high-gain antenna is the largest and most prominent of a suite of antennas on Europa Clipper. The spacecraft will need it as it investigates the ice-cloaked moon that it's named after, Europa, some 444 million miles (715 million kilometers) from Earth. A major mission goal is to learn more about the moon's subsurface ocean, which might harbor a habitable environment. Once the spacecraft reaches Jupiter, the antenna's radio beam will be narrowly directed toward Earth. Creating that narrow, concentrated beam is what high-gain antennas are all about. The name refers to the antenna's ability to focus power, allowing the spacecraft to transmit high-powered signals back to NASA's Deep Space Network on Earth. That will mean a torrent of science data at a high rate of transmission. The precision-engineered dish was attached to the spacecraft in carefully choreographed stages over the course of several hours in a Spacecraft Assembly Facility bay at NASA's Jet Propulsion Laboratory in Southern California. "The antenna has successfully completed all of its stand-alone testing," said Matthew Bray a few days before the antenna was installed. "As the spacecraft completes its final testing, radio signals will be looped back through the antenna via a special cap, verifying that the telecom signal paths are functional." Based at the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland, Bray is the designer and lead engineer for the high-gain antenna, on which he began working in 2014. It's been quite a journey for Bray, and for the antenna. Just over the past year, he's seen the antenna crisscross the country in the lead-up to the installation. Its ability to beam data precisely was tested twice in 2022 at NASA's Langley Research Center in Hampton, Virginia. Between those two visits, the antenna made a stop at NASA's Goddard Space Flight Center in Greenbelt, Maryland, for vibration and thermal vacuum testing to see if it could handle the shaking of launch and the extreme temperatures of outer space. Then it was on to JPL in October 2022 for installation on the spacecraft in preparation for shipment next year to NASA's Kennedy Space Center in Florida. The long journey to Jupiter begins with a launch from Kennedy Space Center in October 2024. Europa in their sights "The high-gain antenna is a critical piece in the buildup of Europa Clipper," said Jordan Evans, the Clipper project manager at JPL. "It represents a very visible piece of hardware that provides the capability that the spacecraft needs to send the science data back from Europa. Not only does it look like a spacecraft now that it has the big antenna, but it's ready for its upcoming critical tests as we progress towards launch." The spacecraft will train nine science instruments on Europa, all producing large amounts of rich data: high-resolution color and stereo images to study its geology and surface; thermal images in infrared light to find warmer areas where water could be near the surface; reflected infrared light to map ices, salts, and organics; and ultraviolet light readings to help determine the makeup of atmospheric gases and surface materials. 
Clipper will bounce ice-penetrating radar off the subsurface ocean to determine its depth, as well as the thickness of the ice crust above it. A magnetometer will measure the moon's magnetic field to confirm the deep ocean's existence and the thickness of the ice. The high-gain antenna will stream most of that data back to Earth over the course of 33 to 52 minutes. The strength of the signal and the amount of data it can send at one time will be far greater than that of NASA's Galileo probe, which ended its eight-year Jupiter mission in 2003. On site at JPL for the antenna installation was Simmie Berman, the radio frequency module manager at APL. Like Bray, she began her work on the antenna in 2014. The radio frequency module includes the spacecraft's entire telecommunications subsystem and a total of seven antennas, the high-gain among them. Her job during installation was to ensure the antenna was properly mounted to the spacecraft and that the components were correctly oriented and well integrated. While the engineers at both APL and JPL have practiced the installation many times, virtually and with real-world mock-ups, Aug. 14 was the first time the high-gain antenna was attached to the spacecraft. "I've never worked on anything of this magnitude, in terms of physical size and also in terms of just general interest," she said. "Little kids know where Jupiter is. They know what Europa looks like. It's supercool to get to work on something that has the potential for such a big impact, in terms of knowledge, for humanity." After completing this major milestone, Europa Clipper still has a few more steps and a few more tests ahead as it is prepared for its trip to the outer solar system. Provided by NASA
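That 33-to-52-minute window is essentially the one-way light travel time, which varies as the Earth-Jupiter distance changes along their orbits. Checking against the 444-million-mile figure quoted above:

```python
SPEED_OF_LIGHT_M_S = 299_792_458
MILE_M = 1609.344

def one_way_light_time_min(distance_miles: float) -> float:
    """Minutes for a radio signal to cross the given distance."""
    return distance_miles * MILE_M / SPEED_OF_LIGHT_M_S / 60

print(f"{one_way_light_time_min(444e6):.1f} min")  # ~39.7, inside the 33-52 min range
```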
Space Technology
Meteosat Third Generation is EUMETSAT's next generation of geostationary satellites, revealing Earth's (cloudy) beauty like never before. The first image from Meteosat Third Generation – Imager 1 (MTG-I1) reveals a level of detail about the weather over Europe and Africa not previously possible from 36,000km above the Earth. The higher-resolution images provided by the instruments on board give weather forecasters more information about the clouds cloaking much of Europe and visible in the equatorial region of Africa and the Atlantic Ocean. Sand and sediment in the waters off Italy are also visible, as well as dust or smog being carried from south Asia. This degree of detail is not possible from the instruments on the Meteosat Second Generation satellites. The image was captured at 11:50 UTC on 18 March 2023 by the Flexible Combined Imager on MTG-I1. By comparison, this image, taken at 11:45 UTC on 18 March 2023 by the SEVIRI instrument on Meteosat-11 (a Meteosat Second Generation satellite), provides noticeably less information than MTG-I1's. The SEVIRI instrument has fewer channels than the Flexible Combined Imager, and provides an image of Europe and Africa every 15 minutes, compared to every 10 minutes from the MTG satellite. The bluish cast to areas of snow and high ice clouds is due to the SEVIRI instrument having fewer channels. The processed data from SEVIRI is less able to match the colours seen by the human eye. A "zoom in" on Europe taken from MTG-I1's first image shows beautiful cloud features such as the "cloud streets" over the Greek Islands created by wind, the so-called von Kármán vortices – also created by wind – downstream of the Canary Islands, and cumulus cloud fields over the Libyan coast. The snow on the Alps and in Norway is also visible in greater detail than is possible to ascertain from the imaging instruments on Meteosat Second Generation satellites. The extra channels of MTG-I1's Flexible Combined Imager, 16 compared to 12 on the MSG SEVIRI instruments, not only produce better true colour imagery, but will also be useful to better detect dust, haze, smoke, cloud properties and wildfires, among other things. The same "zoomed in" image, taken by the imager on a Meteosat Second Generation satellite, clearly shows how the new technology on MTG-I1 provides weather forecasters with more crucial information about the state of the weather. This stunning view of Europe, taken from MTG-I1's first image at 11:50 UTC on 18 March 2023, shows the benefit of the extra channels of the satellite's Flexible Combined Imager instrument. Greater detail about cloud structures, coupled with the fact imagery will be produced more frequently, will enable weather forecasters to monitor and predict rapidly developing severe weather events more accurately and quickly. By comparison, the same view provided by a Meteosat Second Generation instrument at 11:45 UTC on 18 March 2023 shows fewer details, particularly in relation to clouds over Nordic countries. The higher resolution of the imaging instrument on MTG-I1 provides information that MSG satellites could not deliver. The information is crucial for understanding the weather and climate. Visible in this "zoomed in" view of the Mediterranean is turbidity in coastal waters and snow on the Alps, the Apennine Mountains and the Dinaric Alps, as well as more detail about the cloud structures in the image.
Some of these features are not visible in the same view taken at almost the same time by the imager on the MSG satellite, or not visible with the same amount of detail. The image was taken by the Flexible Combined Imager on MTG-I1 at 11:50 UTC on 18 March 2023. The same view, taken by the SEVIRI imager on Meteosat-11, a Meteosat Second Generation satellite, at 11:45 UTC on 18 March 2023, does not show the turbidity of coastal waters or the same degree of detail about the snow and clouds visible in the image from MTG-I1. This colourful and spectacular animation of convective storms developing in the Gulf of Guinea demonstrates the value of Europe's newest and most advanced meteorological satellite, MTG-I1. The imagery was produced from two channels (the IR10.5 and VIS0.6 micron channels) of MTG-I1's Flexible Combined Imager instrument on 19 March from 05:15 UTC. The colours represent the temperatures of the cloud tops, with dark red shades being the coldest. Combining the two channels allows forecasters to see the thermal characteristics of the clouds and the three-dimensional structure of the storm tops. This beautiful animation shows so-called von Kármán vortices over the Canary Islands. Named for the brilliant Hungarian-American mathematician, physicist and aerospace engineer Theodore von Kármán, the vortex "street" is created by the flow of air over obstacles, in this case, the Canary Islands, when a cloud layer is present at a certain altitude. The vortex street consists of cumulus clouds which are nicely seen in this animation produced in true colour from the Flexible Combined Imager on MTG-I1 from 11:50 UTC on 18 March to 11:50 UTC on 19 March 2023. The value of EUMETSAT's newest and most-advanced meteorological satellite, MTG-I1, for monitoring and predicting storms is immense, but its value doesn't end there! MTG-I1's Flexible Combined Imager is also invaluable for monitoring wildfires, as this animation shows. In it, fires in central Africa and their smoke plumes can be seen with more detail about their scale and intensity than has been possible from geostationary orbit up until now. The animation was created from imagery taken between 11:50 UTC on 18 March 2023 and 11:50 UTC on 19 March 2023.
Space Technology
As NASA’s Lucy spacecraft zoomed past its first asteroid on Wednesday, its mission scientists made the surprising discovery of a bonus space rock. The first images sent over by the Lucy mission of its brief rendezvous in the main asteroid belt revealed a binary pair, with a tiny satellite closely orbiting Dinkinesh, NASA announced on Thursday. Dinkinesh, which roughly translates to “marvelous” in Amharic, is around 0.5 miles (790 meters) at its widest, while its smaller companion is about 0.15 miles (220 meters) in size. “Dinkinesh really did live up to its name; this is marvelous,” Hal Levison, principal investigator for Lucy at the Southwest Research Institute, said in a statement. The asteroid was added to Lucy’s itinerary earlier this year in order to test the spacecraft’s instruments before it reaches the Jovian system and begins exploring Jupiter’s Trojan asteroids. “This is an awesome series of images. They indicate that the terminal tracking system worked as intended, even when the universe presented us with a more difficult target than we expected,” Tom Kennedy, guidance and navigation engineer at Lockheed Martin, said in a statement. “It’s one thing to simulate, test, and practice. It’s another thing entirely to see it actually happen.” Scientists had suspected that Dinkinesh may be a binary system, and the asteroid proudly showed off its small moon in the images downlinked by the Lucy spacecraft. “We knew this was going to be the smallest main belt asteroid ever seen up close,” Keith Noll, Lucy project scientist from NASA’s Goddard Space Flight Center, said in the statement. “The fact that it is two makes it even more exciting.” The close flyby lasted for roughly eight minutes, during which Lucy collected data on the asteroid using the color imager and infrared spectrometer that comprise its L’Ralph instrument. Although it was originally meant to be an engineering test for the spacecraft, the images were a pleasant surprise for the scientists on the mission, who are currently poring over the data. It will take around a week for the spacecraft to transmit all the data from the encounter, which the mission team will use to evaluate Lucy’s behavior during the close encounter as the mission prepares for its next flyby of the main belt asteroid Donaldjohanson in 2025. The Lucy mission launched in October 2021 with the aim of studying the Trojan asteroids, a group of rocky bodies that lead and follow Jupiter as it orbits the Sun. Lucy will begin its tour of the Trojan asteroids in 2027 by visiting Eurybates and its binary partner Queta, followed by Polymele and its binary partner, Leucus, Orus, and the binary pair Patroclus and Menoetius. “When Lucy was originally selected for flight, we planned to fly by seven asteroids. With the addition of Dinkinesh, two Trojan moons, and now this satellite, we’ve turned it up to 11,” Levison said.
Space Technology
Saturn makes for an excellent skywatching target this weekend, thanks to its current position in the solar system. The ringed gas giant will reach opposition, meaning it will be situated directly opposite from the sun with Earth in the middle. Around the same time, Saturn will reach perigee, its closest approach to Earth, according to In-The-Sky. The combination of these two celestial events means Saturn will appear at its biggest and brightest this weekend. The planet should remain visible through February 2024. For skywatchers in North America, look to the east-southeast just after sunset to find Saturn in the Aquarius constellation. Within a few hours, the ringed planet should be fairly high in the sky. It will reach its highest point around midnight local time on Sunday (Aug. 27), while the exact moment of opposition will occur a few hours later, around 4:20 a.m. EDT (0820 GMT) on Sunday. At the moment of opposition, Saturn will reach magnitude 0.4, its brightest for 2023. (Brighter objects have a lower magnitude; the full moon, by comparison, has a magnitude of around -12.6, according to NASA.) This means Saturn should be easily visible to the unaided eye as a bright, non-flickering orb in the sky. However, viewing Saturn through binoculars should reveal more detail and bring out the pale yellow color of the planet. Under the right conditions, some high-power binoculars could even begin to bring out faint traces of Saturn's rings or even its largest moon, Titan. Through a telescope, however, Saturn's rings should be clearly apparent. The gas giant's rings are currently beginning to tilt more on-edge toward Earth, and will continue to do so through 2025, according to NASA. That makes this weekend an optimal time to catch a glimpse of one of the best night sky targets available to most backyard skywatchers. And if you're looking to take photos of these celestial objects or the night sky in general, check out our guide on how to photograph the moon or how to photograph the planets, as well as our best cameras for astrophotography and best lenses for astrophotography. Editor's Note: If you get an awesome picture of Saturn at opposition and would like to share it with Space.com's readers, send your photo(s), comments, and your name and location to [email protected].
Space Technology
On some nights, one of the brightest objects in the sky is neither a planet nor a star. It is a telecommunications satellite called BlueWalker 3, and at times it outshines 99% of the stars visible from a dark location on Earth, according to observations reported today in Nature. BlueWalker 3 is the most brilliant recent addition to a sky that is already swarming with satellites. The spaceflight company SpaceX alone has launched more than 5,000 satellites into orbit, and companies around the globe have collectively proposed launching more than half a million satellites in the coming years — a scenario that astronomers fear could hamper scientific observations of the Universe. The study “shows us that there are no boundaries to satellite brightness”, says Patrick Seitzer, an emeritus astronomer at the University of Michigan, Ann Arbor, who was not involved in the study. “I’m concerned that we’re going to see a very large number of large satellites launched in the next decade, and it will change the appearance of the night sky forever.” Twilight star Telecommunications firm AST SpaceMobile in Midland, Texas, launched BlueWalker 3 on 10 September 2022 as a prototype for a satellite fleet designed to make mobile broadband available almost anywhere. The satellite’s huge array of antennas and white colour mean that it reflects a considerable amount of sunlight back towards Earth, making it shine even at twilight. To quantify its effects, professional and amateur astronomers embarked on an international observation campaign, ultimately spotting the satellite from locations in Chile, the United States, Mexico, New Zealand, the Netherlands and Morocco. The researchers assessed the satellite’s shine using a standard astronomical index called the magnitude scale, on which the brightest objects have the smallest numbers. The brilliant Venus, for example, can reach a magnitude of –4.6, whereas the North Star is much dimmer, at magnitude +2. That is roughly the magnitude limit visible from a city with the naked eye. On 10 November 2022, the satellite unfurled its array of antennas, causing it to brighten to magnitude +0.4. If it were a star, it would have been one of the ten brightest in the sky. But its apparent brightness changes as the satellite rotates, and by late December, it had dimmed to a magnitude of +6. It then brightened again, reaching magnitude +0.4 once more on 3 April 2023. The International Astronomical Union, a group of professional astronomers, recommends that artificial satellites in low-Earth orbit have a maximum brightness of magnitude +7. BlueWalker 3 can be hundreds of times brighter, the authors found. And AST SpaceMobile says it plans to provide broadband coverage with a fleet of 90 similar satellites, including 5 that are scheduled to launch in early 2024. Breakaway debris Moreover, the team observed a bright object separating from the main satellite during deployment, and later learnt that this was the container that protected the folded antennas during ascent, before being jettisoned into space. It, too, was relatively bright at magnitude +5.5. In a statement to Nature, AST SpaceMobile said that it is currently working with NASA and astronomy groups to address these concerns. Many astronomers were caught by surprise in mid-2019, when SpaceX successfully launched 60 satellites, creating a ‘train of stars’ that glided through the night sky. Now, low-Earth orbit is littered with thousands of commercial satellites.
If captured by a telescope during a long exposure, such objects can leave a bright streak that renders the data unreadable. Astronomers have long steered their telescopes to avoid the brightest of these objects. That workaround will still be possible if AST SpaceMobile launches a fleet of satellites similar to BlueWalker 3, says Jonathan McDowell, an astronomer at the Center for Astrophysics | Harvard & Smithsonian in Cambridge, Massachusetts, who was not involved in the study. The bigger concern, he says, is that other companies might also launch constellations of large satellites. If that happens, Seitzer says, “then the night sky will be irreversibly changed”. A search for solutions To avoid such a scenario, astronomers are working to find mutual solutions. Some of the study’s authors, for example, are a part of a newly formed coalition called CPS that aims to tackle the issue and has been in contact with companies including SpaceX and AST SpaceMobile. SpaceX is already trying methods to make its satellites less visible, and coalition members say that AST SpaceMobile also seems amenable to dimming its satellites. The company says it is planning to use anti-reflective materials on its next-generation satellites, as well as certain flight manoeuvres to reduce the crafts’ apparent magnitude. “They left us with a very good impression that they would work more with us,” says coalition member Constance Walker, an astronomer at the National Science Foundation’s NOIRLab in Tucson, Arizona, and also a study author. Such discussions are the way forwards, she says. “No one is going to return to yesterday, when we had darker skies.”
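The "hundreds of times brighter" claim follows directly from the logarithmic magnitude scale, on which a difference of 5 magnitudes is defined as a factor of exactly 100 in flux. A short check against the numbers reported above:

```python
def brightness_ratio(mag_a: float, mag_b: float) -> float:
    """How many times brighter an object at magnitude mag_a is than one at mag_b.

    A 5-magnitude difference is exactly a factor of 100 in flux,
    so the general ratio is 10**(0.4 * (mag_b - mag_a)).
    """
    return 10 ** (0.4 * (mag_b - mag_a))

# BlueWalker 3 at its peak (+0.4) versus the IAU's +7 recommendation:
print(f"{brightness_ratio(0.4, 7.0):.0f}x")  # ~437, i.e. 'hundreds of times brighter'
```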
Space Technology
Satnav for the Moon could benefit from Fibonacci’s expertise Middle Ages maths to the rescue Future satellite navigation systems intended for Earth's Moon may be aided by a model of it developed with methods that go back to mathematician Fibonacci, who lived 800 years ago. With increasing interest in missions to return people to the Moon and even establish some sort of permanent outpost on Earth’s satellite, it seems that modern successors to the lunar vehicles of the Apollo missions may be assisted by some form of navigation system, similar to the GPS system on Earth. US space agency NASA successfully sent an uncrewed Orion spacecraft into lunar orbit and back earlier this year as part of the Artemis I mission, and plans to repeat the trick with a human crew for Artemis II in 2024. The space agency’s long-term plans are to build a sustainable presence on the lunar surface, including at least one moon base. Both NASA and the European Space Agency (ESA) have already conceived of potential GPS-like satellite constellations around the Moon, named LunaNet and Moonlight, respectively, in order to provide accurate position, navigation, and timing (PNT) services for lunar activity. However, GPS systems on Earth do not take into account the actual shape of the planet itself; they use an approximation based on the rotational ellipsoid that best fits its true shape, which is wider at the equator than between the poles. The Moon, by contrast, rotates more slowly, with a rotational period equal to its orbital period around the Earth. This leads to the Moon being more spherical than the Earth, but still not truly spherical. For the mapping of the Moon done so far, it has been sufficient to approximate its shape as a sphere, but the last time such calculations were made was in the 1960s. With greater lunar activity planned for the future, a more accurate representation is called for. Scientists at the Faculty of Science of Eötvös Loránd University (ELTE) at Budapest in Hungary have sought to address this by calculating the parameters of the rotating ellipsoid that best fit the theoretical shape of the Moon. The scientists, Kamilla Cziráki, a second-year geosciences student specialising in geophysics, and Gábor Timár, head of the Department of Geophysics and Space Sciences, made use of lunar surface data derived from the NASA GRAIL mapping mission, and took height samples at evenly spaced points on the surface, then used these to search for the axes of a best-fit rotational ellipsoid. This is where Fibonacci comes in. One of the simplest solutions to equally distribute a given number of points on a spherical surface is the Fibonacci Sphere (sketched in code below), related to the Fibonacci sequence, which the medieval Italian mathematician is credited with introducing to Europe. He also helped Europe kick the habit of using Roman numerals in favor of Indo-Arabic digits, paving the way for major advances in European mathematics. An article describing the physicists' work has been published in the academic journal Acta Geodaetica et Geophysica.
The abstract states that in the case of LunaNet, which NASA plans to create for geographic information system (GIS) applications, the reference surface is currently planned to be the Lunar Reference Frame Standard, a sphere with a radius of 1,737.4km. However, in the future, it is conceivable that a rotational ellipsoid will be needed to replace it. The scientists also performed the same calculations for the Earth, with the aim of showing that this method can give a good estimate of the ideally fitting ellipsoid. Their results matched the existing model closely, with a deviation of only 60cm (24in). ®
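The Fibonacci sphere itself is easy to reproduce: points are spaced uniformly along the polar axis (giving equal-area bands), while successive points step around the axis by the golden angle, which derives from the golden ratio that the Fibonacci sequence converges to. A minimal sketch of the standard construction (the researchers' actual sampling code is not public, so treat this as an illustration):

```python
import math

def fibonacci_sphere(n: int) -> list[tuple[float, float, float]]:
    """n near-evenly distributed points on the unit sphere.

    z is spaced uniformly (equal-area latitude bands), and each point
    advances around the axis by the golden angle, ~2.39996 radians.
    """
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))
    points = []
    for i in range(n):
        z = 1.0 - (2.0 * i + 1.0) / n        # uniform in (-1, 1)
        ring_radius = math.sqrt(1.0 - z * z)  # circle radius at this z
        theta = golden_angle * i              # rotate by the golden angle
        points.append((ring_radius * math.cos(theta),
                       ring_radius * math.sin(theta), z))
    return points

# Sampling lunar heights along, say, 10,000 such directions yields the
# evenly spread inputs needed to fit the rotational ellipsoid's axes.
print(fibonacci_sphere(4))
```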
Space Technology
Humanity is slowly losing access to the night sky, and astronomers have invented a new term to describe the pain associated with this loss: "noctalgia," meaning "sky grief." Along with our propensity for polluting air and water and the massive amounts of carbon we're dumping into the atmosphere to trigger climate change, we have created another kind of pollution: light pollution. Most of our light pollution comes from sources on the ground. While humans have had campfires and handheld lanterns for ages, the amount of light we produce through electricity is astounding. We light up our office buildings, streets, parking lots and homes. Of course, some of this lighting is needed for safety and security, but much of it goes to waste. Plus, until we became more aware of light pollution, we tended to allow lighting to spill in every direction, both toward the areas we were trying to illuminate and straight up into the night sky. Ironically, switching to efficient LED lighting often exacerbates the problem. Because those kinds of lights are so inexpensive to operate and last so long, many city and building planners just assume the lights can be left on all night, without any consideration of cost or replacement. Only in the most remote deserts, wilderness areas and oceans can you find skies as dark as the ones our ancestors knew. More recently, the explosive growth in satellite communication "constellations," like SpaceX's Starlink system, has put orders of magnitude more satellites into orbit than even a decade ago, with even more on the way. Those satellites don't just spoil deep-space astronomical observations when they cross a telescope's field of view; they also scatter and reflect sunlight from their solar arrays. The abundance of satellites is causing the overall brightness of the sky to increase all around the globe. Some researchers have estimated that, on average, our darkest night skies, located in the most remote regions of the world, are 10% brighter than they were a half century ago, and the problem is only getting worse. The loss of the night sky has several tangible and cultural impacts. We are losing a rich tradition of human cultural knowledge; cultures around the world and throughout history have used the sky as a springboard for the imagination, painting heroes, monsters and myths in the constellations. Nowadays, city dwellers are lucky to see even the brightest stars in the sky, let alone the faintest sketch of a familiar constellation. These millennia-old sky traditions aren't just random stories meant to entertain around the fire; they are often cornerstones of entire cultures and societies. We all share the same sky, and anyone from the same culture can identify the same constellations night after night. The loss of that access and heritage is a loss of part of our humanity. Many animal species are suffering as well. What good are night-adapted senses in nocturnal species if the night sky isn't much darker than the daytime sky? Researchers have identified several species whose circadian rhythms are getting thrown off, making them vulnerable to predation (or, the reverse: the inability to effectively locate prey). Given the harmful effects of light pollution, a pair of astronomers has coined a new term to help focus efforts to combat it. Their term, as reported in a brief paper in the preprint database arXiv and a letter to the journal Science, is "noctalgia."
In general, it means "sky grief," and it captures the collective pain we are experiencing as we continue to lose access to the night sky. Thankfully, there is a way to tackle noctalgia, just as there are ways to combat climate change. On the ground, efforts have sprung up across the globe to create dark-sky reserves, where surrounding communities pledge not to encroach with further expansions of light pollution. Still, those are usually in extremely remote and inaccessible regions of the globe, so other efforts have focused on working with community and business leaders to install night-friendly lighting, such as devices that turn off automatically or point only at the ground (or are simply not used at all). Tackling satellite-based pollution is another matter, as that will require international cooperation and pressure on companies like SpaceX to be better stewards of the skies they are filling with equipment. Still, it's not impossible, and hopefully someday, noctalgia will be a thing of the past.
Space Technology
There is an allure to moon dust. Just as its unique adhesive properties cause it to stick to everything it touches, it tends to draw the attention of everyone who sees it, regardless of their walk of life. It is appropriate, then, that Col&MacArthur named its moon dust-infused timepiece the LUNAR1,622. "It's the gravity on the moon," said Sebastien Colen, founder and CEO of the Belgium-based watch company, referring to the numerical designation of their new release and the moon's surface gravity of 1.622 meters per second squared, about a sixth of Earth's. "Every time we release a new collection, we want to make that link to the gravity of the body." Launched on the Kickstarter crowdfunding platform on Wednesday (Oct. 4), the first day of World Space Week, the LUNAR1,622 is offered by Col&MacArthur in several configurations, ranging from a pledge of $399 for a version without moon dust to $599 and $999 for models with moon dust and either Japanese Miyota or Swiss Sellita watch movements and steel or titanium bodies. Looking at the moon dust-imbued LUNAR1,622, your eye is immediately attracted to the 3 o'clock position, where a chamber on the face holds the dust from a lunar meteorite found in Northwest Africa in 2017. Secured within the chamber, the light gray material looks like a miniature swath of the lunar surface. "We wanted to [have] the moon dust loose in the watch, but it is so thin that it was coming up with an electrostatic field and sticking to the glass. So, once you got the watch and shook it a bit, it was like a painting and you would no longer be able to see past the monochrome coating," Colen told collectSPACE.com. "So we decided to stick the dust to a platform to improve the rendering." The lunar dust is certified as authentic by MSG Meteorites, a British company that is licensed by the International Meteoritic Collectors Association (IMCA). A lunar meteorite is a piece of the moon that through natural processes broke off and then fell to Earth. They are identified in part by comparing their geological properties to the moon rocks brought back by the Apollo astronauts and later by robotic sample return missions. Even without the moon dust, Col&MacArthur's LUNAR1,622 has been designed to pay tribute to the Apollo missions of the past and Artemis lunar landings of the near future. On the standard LUNAR1,622 model, which omits the moon dust, an image of an Apollo moon boot print takes the lunar material's place. The dial on all of the models displays a 3D rendering of the moon, emphasizing its seas ("mares") and crater-pocked topography. "Along the edge of the [moon dust] chamber, we have the dates when Apollo 11 landed on the moon with the initials 'N' and 'A' for Neil Armstrong," Colen said, adding that Armstrong's famous first words spoken on the surface of the moon — 'One small step for man, one giant leap for mankind' — are inscribed opposite the chamber on the bezel. "The other thing is, around the dial we have the Apollo mission [roman numerical designations] that either went into orbit around the moon or landed on the moon. The missions that went into orbit are printed in white and those that landed appear in gold on the index of the dial," said Colen. The case back displays NASA's logo in white and blue. "For that, we had to get the approval of NASA," Colen told collectSPACE. "We had to submit the design to NASA, which has very specific guidelines for its logo's use. We finally got the approval in June after it took maybe six months to get. It was not that easy."
Hidden under the space agency's logo (within the watch) is an NFC (Near Field Communication) chip that enables the LUNAR1,622 to interact with smartphones (or other NFC readers) to display info about the owner and the source meteorite for the lunar dust. (The NFC technology is initially only available with the premium model, but will be unlocked for all models if the Kickstarter campaign reaches $200,000 in the first 24 hours.) Colen is hoping the LUNAR1,622 draws as much interest as, if not more than, the first watch in the Col&MacArthur Interstellar series. In May 2022, the RED 3,721 debuted on Kickstarter featuring a Mars theme and dust from a Martian meteorite. That campaign raised more than $390,000 from nearly 600 backers. "I would say the mission for Col&MacArthur is commemorating the history while having a bridge to a better future, so I think that the moon here on the LUNAR1,622 is commemorating the past, but also represents what it brings to the future," said Colen. "It's been 55 years since we've been on the moon, and it's only just now that we're thinking about going again. We think the LUNAR1,622 is a nice way to connect our past to the future."
Space Technology
The ability to have access to the Internet or use a mobile phone anywhere in the world is taken more and more for granted, but the brightness of Internet and telecommunications satellites that enable global communications networks could pose problems for ground-based astronomy. University of Illinois Urbana-Champaign aerospace engineer Siegfried Eggl coordinated an international study confirming recently deployed satellites are as bright as stars seen by the unaided eye. "From our observations, we learned that AST Space Mobile's BlueWalker 3 -- a constellation prototype satellite featuring a roughly 700 square-foot phased-array antenna -- reached a peak brightness of magnitude 0.4, making it one of the brightest objects in the night sky," Eggl said. "Although this is record breaking, the satellite itself is not our only concern. The untracked Launch Vehicle Adapter had an apparent visual magnitude of 5.5, which is also brighter than the International Astronomical Union recommendation of magnitude 7." For comparison, the brightness of the stars we can see with an unaided eye is between minus 1 and 6 magnitude, minus 1 being the brightest. Sirius, the brightest star, is minus 1. Planets like Venus can sometimes be a bit brighter -- closer to minus 4, but the faintest stars we can see are roughly magnitude 6. "One might think if there are bright stars, a few more bright satellites won't make a difference. But several companies plan to launch constellations," Eggl said. "For example, Starlink already has permission to launch thousands of satellites, but they'll probably get their full request of tens of thousands granted eventually. "And that's just one constellation of satellites. Europe and China want their own constellations and so does Russia. Just those in the United States being negotiated with the FCC amount to 400,000 satellites being launched in the near future. There are only 1,000 stars you can see with the unaided eye. Adding 400,000 bright satellites that move could completely change the night sky." Eggl is a member of the International Astronomical Union Centre for the Protection of the Dark and Quiet Sky from Satellite Constellation Interference, IAU. "BlueWalker 3 is so bright that most of the big telescopes such as the Rubin Observatory believe it could obliterate large parts of exposures," Eggl said. "They already have to avoid observing Mars and Venus for the same reason, but we know where the planets are so we can dodge them. We cannot accurately predict where all the satellites will be years in advance. Just accepting recurring data loss in multi-billion-dollar observatories is not an option either." He said although satellites won't necessarily damage the telescope's CCDs, or charge-coupled devices, they will still cause data loss from the streaks. Extremely bright satellites could ruin the entire field of view, like trying to stargaze when someone periodically shines a flashlight into your eyes. Eggl said several solutions to the problem are being explored in collaboration with the Laboratory for Advanced Space Systems at Illinois and satellite operators such as SpaceX. "Starlink is looking at making their satellites' surfaces darker, which absorbs more and reflects less visible sunlight. But the absorption generates heat. The satellites then have to emit infrared light which means observations in optical wavelengths don't have as large of a problem, but infrared observations might. And heat is one of the biggest engineering problems that we have in space. 
So, painting everything black comes with repercussions," he said. Another idea from SpaceX is to make satellites' solar panels more reflective with dielectric mirrors. The mirrors allow the satellites to change the direction of the reflection so that it's not pointing directly at the Earth. "If SpaceX can make the solar panels point in a different direction to avoid glints, or use these mirror tricks, they might solve a lot of the problems we have with the optical flaring of Starlink satellites," Eggl said. "With other providers, it's not quite as easy. AST has gigantic satellites, with hundreds of square feet of electronic phased arrays, that they need to communicate with cell phones on the ground. If they made satellites smaller, more of their radio signals would leak out through so-called 'side lobes,' potentially affecting radio astronomy sites." Eggl said AST also prefers to keep the satellite pointed toward the surface of the Earth to achieve maximum efficiency. Starlink solutions may not easily translate to AST satellites, so new mitigation strategies are needed. "We are trying to work with the space industry, where possible," he said. "We want to solve this together so it's a collaborative effort that everybody can sign onto because that's the fastest route to get things done." Ph.D. student Nandakumar analyzed the data for this first international study to be published from the center. Nandakumar works with Jeremy Tregloan-Reed at the Universidad de Atacama in Chile. Story Source: Materials provided by the University of Illinois Grainger College of Engineering. Original written by Debra Levy Larson.
Space Technology
WASHINGTON, July 21 (Reuters) - Some people are two-faced, figuratively speaking of course. The ancient Roman god Janus was two-faced, literally - with one looking forward and another backward, representing transitions and duality. But a two-faced star? Yes, indeed. Scientists have observed a white dwarf star - a hot stellar remnant that is among the densest objects in the cosmos - that they have nicknamed Janus because it has the peculiar distinction of being composed of hydrogen on one side and helium on the other. "Janus is the Roman god with two faces, so we thought it was very appropriate. Moreover, Janus is the god of transition, and the white dwarf might be currently transitioning from having an atmosphere made of hydrogen to one made of helium," said Ilaria Caiazzo, a Caltech postdoctoral fellow in astrophysics and lead author of the study published this week in the journal Nature. The star is located in our Milky Way galaxy about 1,300 light years from Earth in the direction of the Cygnus constellation. A light year is the distance light travels in a year, 5.9 trillion miles (9.5 trillion km). Janus is fairly massive for a white dwarf, with a mass 20% larger than that of our sun compressed into an object with a diameter half that of Earth. It rotates on its axis every 15 minutes - very fast considering these stars usually rotate every few hours to a few days. "White dwarfs form at the very end of a star's life. About 97% of all stars are destined to become white dwarfs when they die," Caiazzo said. "Our sun, for example, is currently burning hydrogen into helium in its core. When the hydrogen in the core is depleted, the sun will start burning helium into carbon and oxygen. When the helium also is gone from the center, the sun will eject its outer layers into space in an event called a planetary nebula and the core will slowly contract and become a white dwarf," Caiazzo added. The good news for Earthlings is that it should be 5 billion years before any of that happens to our sun. Janus was spotted using the Zwicky Transient Facility at Caltech's Palomar Observatory near San Diego, with subsequent observations made by other ground-based telescopes. After a white dwarf forms, its heavier elements are thought to sink to the star's core while its lighter elements - hydrogen being the lightest, followed by helium - float to the top. This layered structure is believed to be destroyed at a certain stage in the evolution of some white dwarfs when a strong mixing blends the hydrogen and helium together. Janus may represent a white dwarf in the midst of this transitional blending process, but with the puzzling development of one side being hydrogen while the other side is helium. The researchers suspect that its magnetic field may be responsible for this asymmetry. If the magnetic field is stronger on one side than the other, as is often the case with celestial objects, one side could have less mixing of elements, becoming hydrogen heavy or helium heavy. "Many white dwarfs are expected to go through this transition, and we might have caught one in the act because of its magnetic field configuration," Caiazzo said. Janus is not the only exotic white dwarf known. Caiazzo was part of a research team that in 2021 reported on one with a diameter slightly larger than that of Earth's moon that boasted the greatest mass and smallest size of any known white dwarf. "Every time we look at stars in different ways, we are bound to be surprised and even baffled sometimes," Caiazzo said. 
"Stellar phenomenology is extremely rich, and no two stars are the same if looked at closely enough."
Space Technology
New research shows quasars can be buried in their host galaxies A new study reveals that quasars, the intensely luminous centers of galaxies powered by supermassive black holes, can sometimes be obscured by dense clouds of gas and dust in their host galaxies. This challenges the prevailing idea that quasars are only obscured by donut-shaped rings of dust in the close vicinity of the black hole. Quasars are extremely bright objects powered by black holes gorging on surrounding material. Their powerful radiation can be blocked if thick clouds come between us and the quasar. Astronomers have long thought this obscuring material only exists in the quasar's immediate surroundings, in a "dusty torus" (or donut) encircling it. Now, a team of scientists led by Durham University has found evidence that in some quasars, the obscuration is entirely caused by the host galaxy in which the quasar resides. Using the Atacama Large Millimeter Array (ALMA) in Chile, they observed a sample of very dusty quasars with intense rates of star formation. They found that many of these quasars live in very compact galaxies, known as "starburst galaxies," no more than 3000 light-years across. These starburst galaxies can form more than 1000 stars like the sun per year. To form such a large number of stars, the galaxy needs a huge amount of gas and dust, which are essentially the building blocks of stars. In such galaxies, clouds of gas and dust stirred up by rapid star formation can pile up and completely hide the quasar. The full study has been published in the journal Monthly Notices of the Royal Astronomical Society (MNRAS). Lead author of the study Carolina Andonie, Ph.D. student in the Center for Extragalactic Astronomy at Durham University, said, "It's like the quasar is buried in its host galaxy. "In some cases, the surrounding galaxy is so stuffed with gas and dust, not even X-rays can escape. "We always thought the dusty donut around the black hole was the only thing hiding the quasar from view. "Now we realize the entire galaxy can join in. "This phenomenon only seems to happen when the quasar is undergoing an intense growth spurt." The team estimates that in about 10–30% of very rapidly star-forming quasars, the host galaxy is solely responsible for obscuring the quasar. The findings provide new insights into the link between galaxy growth and black hole activity. Obscured quasars may represent an early evolutionary stage, when young galaxies are rich with cold gas and dust, fueling high rates of star formation and black hole growth. Study co-author Professor David Alexander of Durham University said, "It's a turbulent, messy phase of evolution, when gas and stars collide and cluster in the galaxy's center. The cosmic food fight cloaks the baby quasar in its natal cocoon of dust." More information: Carolina Andonie et al, Obscuration beyond the nucleus: infrared quasars can be buried in extreme compact starbursts, MNRAS (2023). On arXiv: DOI: 10.48550/arxiv.2310.02330 Provided by Durham University
Space Technology
A retired Technical Fellow from Rockwell Collins "released a 65-page paper that details an easy-to-understand, relatively inexpensive, and feasible plan to turn an asteroid into a space habitat," reports Universe Today (in an article republished at Science Alert): Dr. David W. Jensen breaks the discussion into three main categories — asteroid selection, habitat style selection, and mission strategy to get there (i.e., what robots to use)... He eventually settled on a torus as the ideal habitat type and then dives into calculations about the overall station mass, how to support the inner wall with massive columns, and how to allocate floor space. All important, but how exactly would we build such a massive behemoth? Self-replicating robots are Dr. Jensen's answer. The report's third section details a plan to utilize spider robots and a base station that can replicate themselves. He stresses the importance of only sending the most advanced technical components from Earth and using materials on the asteroid itself to build everything else, from rock grinders to solar panels... With admittedly "back-of-the-envelope" calculations, Dr. Jensen estimates that the program would cost only $4.1 billion. That is far less than the $93 billion NASA plans to spend on the Artemis program. And the result would be a space habitat that provides 1 billion square meters of land that didn't exist before. That's a total cost of $4.10 per square meter to build land — in space. Possibly even more impressive is the timeline — Dr. Jensen estimates that the entire construction project could be done in as little as 12 years. However, it will still take longer to fill the habitat with air and water and start regulating its temperature.
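Two of the article's numbers are easy to reproduce, and a toy doubling model hints at why self-replication carries the plan's economics. A minimal sketch; the seed-robot count and number of replication cycles below are made-up illustrative values, not figures from Dr. Jensen's paper:

```python
# Cost-per-area check using the article's figures.
total_cost_usd = 4.1e9        # estimated program cost
land_area_m2 = 1e9            # habitable area the torus would provide
print(f"Cost per square meter: ${total_cost_usd / land_area_m2:.2f}")  # $4.10

# Toy self-replication model (hypothetical numbers): each generation of
# robots builds a copy of itself before switching to construction work.
seed_robots = 4               # assumed robots sent from Earth
generations = 10              # assumed replication cycles
fleet = seed_robots * 2**generations
print(f"Fleet after {generations} doublings: {fleet} robots")  # 4096
```

The point of the toy model is only that exponential replication, not launch mass from Earth, is what would make a construction fleet of thousands affordable.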
Space Technology
We'll soon get sharper vision on cosmic X-rays. A new satellite aims to study huge objects in the universe, using instruments able to measure the heat of a single X-ray photon. The X-ray Imaging and Spectroscopy Mission (XRISM — pronounced "crism") will analyze X-rays using the widest field-of-view instrument ever implemented in this kind of imaging probe. The instrument will be able to "pry apart high-energy light into the equivalent of an X-ray rainbow," according to a NASA statement. XRISM is scheduled to launch from Japan's Tanegashima Space Center on Aug. 25 (August 26, Japan time zone). Exact time of day has not yet been announced. When the mission launches, you can watch it live at Space.com. "The mission will provide us with insights into some of the most difficult places to study, like the internal structures of neutron stars and near-light-speed particle jets powered by black holes in active galaxies," Brian Williams, XRISM project scientist at NASA's Goddard Space Flight Center, said in an agency statement. (Active galaxies are large collections of stars with an unusual amount of energy being produced in the center.) The satellite will use a pair of instruments to study massive cosmic phenomena. Examples include the effects of extreme gravity on the behavior of matter, emissions from dense, city-sized star cores known as neutron stars, distant particle jets, and black hole rotations. The first instrument on XRISM, a spectrometer for X-rays, is called Resolve. Each of the pixels in Resolve's 6-by-6-pixel detector can absorb a single X-ray photon. The instrument's precise capability will let Resolve catalog up to millions of measurements in ultra-high resolution. To do its mission, the instrument must be chilled to super-cold temperatures, near absolute zero. Resolve's housing sits in a special flask (a dewar) of liquid helium, which chills the instrument to around -460 degrees Fahrenheit (-270 Celsius). Resolve's field of view will be enlarged with a complementary instrument called Xtend. Xtend will allow Resolve to capture images within an area wider than any previous X-ray imaging satellite — an area of the sky about 60 percent bigger than a full moon. Both Resolve and Xtend will use twin X-ray mirror assemblies developed at Goddard, near Baltimore.
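As a small sanity check on the quoted temperatures, the Fahrenheit and Celsius figures above are both loose roundings of essentially the same thing: a temperature just shy of absolute zero. A quick conversion sketch in Python:

```python
def fahrenheit_to_celsius(f: float) -> float:
    """Standard conversion: C = (F - 32) * 5/9."""
    return (f - 32.0) * 5.0 / 9.0

print(f"-460 F = {fahrenheit_to_celsius(-460):.1f} C")              # about -273.3 C
print(f"Absolute zero: -273.15 C = {-273.15 * 9 / 5 + 32:.2f} F")   # -459.67 F
# Both figures quoted in the article are rounded; the takeaway is that
# Resolve's dewar holds the detector very close to absolute zero.
```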
Space Technology
Late last year, a communications satellite unfurled its giant wings, stretching out a 693-square-foot (64-square-meter) antenna array in low Earth orbit. Once fully deployed, BlueWalker 3 became one of the brightest objects in the night sky, outshined only by the Moon, Venus, Jupiter and seven stars, according to new research. Texas-based startup AST SpaceMobile launched its prototype satellite in September 2022 as part of its plan to connect smartphones directly to orbit, essentially establishing cellphone towers in space. As the first of more than 100 satellites set to build an orbital constellation, the prototype satellite is already a major threat to Earth-based observations of the universe. The authors of a new study published today in Nature compiled observations of BlueWalker 3 from professional and amateur astronomers in Chile, the U.S., Mexico, New Zealand, the Netherlands and Morocco. Through the different telescopes, BlueWalker 3 appeared as bright as two of the ten brightest stars in the night sky, Procyon and Achernar. AST SpaceMobile isn’t the only company littering low Earth orbit with satellites. Elon Musk’s SpaceX is building an internet constellation in low Earth orbit with plans to deploy upwards of 42,000 satellites. Amazon is also planning to launch a fleet of 3,236 satellites for Project Kuiper, while OneWeb wants to launch 648 satellites. Those satellites reflect sunlight back to our planet, potentially causing bright streaks across astronomical images and interfering with scientific data. Satellites like BlueWalker 3 might also present an additional source of noise for radio astronomy, interfering with wide band receivers and affecting nearby protected radio astronomy bands, according to the new study. The BlueWalker 3 prototype has the largest-ever commercial communications array deployed in space. During overhead observations in September, the satellite had a brightness magnitude of around +3.5, making it visible to the naked eye. However, since the satellite deployed its antenna array in November of last year, its brightness increased by about two magnitudes, Marco Langbroek, an astrodynamics lecturer at Delft Technical University in the Netherlands, told Gizmodo in November 2022. The authors of the new study are not just concerned about BlueWalker 3 as a lone satellite, but rather that it reflects a trend of increasingly larger and brighter satellites. BlueWalker 3 periodically becomes hundreds of times brighter than the brightness limit recommended by the International Astronomical Union (IAU) to mitigate the effects of satellites on the visibility of the cosmos, according to the new study. There are currently no official rules to regulate the brightness of satellites in orbit. Companies like SpaceX have been in talks with the IAU to figure out ways to dim the brightness of the Starlink satellites, which have already interfered with astronomical observations. The authors behind the new study recommended that the effect of satellites on astronomy should be considered as part of their launching authorization processes.
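Because astronomical magnitudes are logarithmic, the brightness changes quoted above are larger than they may sound. A minimal Python sketch of the standard conversion (a magnitude difference of Δm corresponds to a brightness ratio of 10^(0.4·Δm)), using only figures from this article and the IAU's magnitude-7 recommendation:

```python
def brightness_ratio(delta_mag: float) -> float:
    """Brightness factor for a magnitude difference (lower magnitude = brighter)."""
    return 10 ** (0.4 * delta_mag)

# The roughly two-magnitude jump reported after the antenna deployed:
print(f"Two magnitudes is about {brightness_ratio(2.0):.1f}x brighter")      # ~6.3x

# Peak magnitude 0.4 versus the IAU-recommended faintness of magnitude 7:
print(f"Mag 0.4 vs mag 7 is about {brightness_ratio(7 - 0.4):.0f}x brighter")  # ~437x
```

The second figure is why the study describes BlueWalker 3 as periodically "hundreds of times brighter" than the recommended limit.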
Space Technology
A NASA image of one of the twin Voyager space probes. The Jet Propulsion Laboratory lost contact with Voyager 2 on July 21 after mistakenly pointing its antenna 2 degrees away from Earth. (NASA/Getty Images) NASA has detected a signal from Voyager 2 after nearly two weeks of silence from the interstellar spacecraft. NASA's Jet Propulsion Laboratory said on Tuesday that a series of ground antennas, part of the Deep Space Network, had registered a carrier signal from Voyager 2 on Tuesday. "A bit like hearing the spacecraft's 'heartbeat,' it confirms the spacecraft is still broadcasting, which engineers expected," JPL wrote in a tweet. NASA said it lost contact with Voyager 2, which is traveling 12.3 billion miles away from Earth, on Friday after "a series of planned commands" inadvertently caused the craft to turn its antenna 2 degrees away from the direction of its home planet. What might seem like a slight error had big consequences: NASA said it wouldn't be able to communicate with the craft until October, when the spacecraft would go through one of its routine repositioning steps. Now that the scientists know Voyager 2 is still broadcasting, engineers will try to send the spacecraft a command to point its antenna back towards Earth. But program manager Suzanne Dodd told the Associated Press that they're not too hopeful this step will work. "That is a long time to wait, so we'll try sending up commands several times" before October, Dodd said. Even if Voyager 2 fails to re-establish communications until fall, the engineers expect it to stay moving on its planned trajectory on the edge of the solar system. Voyager 2 crossed into interstellar space in Dec. 2018 — more than 40 years after it first launched from Cape Canaveral, Fla. To this day, Voyager 2 remains the only spacecraft to have ever flown past Uranus. Its primary mission was to study the outer solar system, and already, Voyager 2 has proved its status as a planetary pioneer. Equipped with several imaging instruments, the spacecraft is credited with documenting the discovery of 16 new moons, six new rings and Neptune's "Great Dark Spot." Voyager 2 is also carrying some precious cargo, like a message in a bottle, should it find itself as the subject of another world's discovery: A golden record, containing a variety of natural sounds, greetings in 55 languages and a 90-minute selection of music. Last month's command mix-up means Voyager 2 is not able to transmit data back to Earth, but it also foreshadows the craft's inevitable end an estimated three years from now. "Eventually, there will not be enough electricity to power even one instrument," reads a NASA page documenting the spacecraft's travels. "Then, Voyager 2 will silently continue its eternal journey among the stars." Voyager 2's sister spacecraft, Voyager 1, meanwhile, is still broadcasting and transmitting data just fine from a slightly farther vantage point of 15 billion miles away.
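To see why a mere 2-degree pointing error cut off communications, consider the geometry at Voyager 2's distance. A quick Python check using only the figures in this story:

```python
import math

distance_miles = 12.3e9   # Voyager 2's distance from Earth, from the article
offset_deg = 2.0          # reported antenna pointing error

# Lateral distance by which the beam's center misses Earth at that range.
miss_miles = distance_miles * math.tan(math.radians(offset_deg))
print(f"Beam center misses Earth by about {miss_miles:.2e} miles")
# Roughly 4.3e8 miles, more than four times the Earth-sun distance, so only
# the faint edge of the beam reaches Earth -- consistent with ground antennas
# still picking up a weak carrier "heartbeat" but no usable data.
```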
Space Technology
A new method helps to measure cosmological distances more accurately After a complex statistical analysis of some one million galaxies, a team of researchers from several Chinese universities and the University of Cordoba has published the results of the study in the journal Nature Astronomy. For over two years, they had been working on the project, which makes it possible to determine cosmological distances with a new and greater degree of precision. The study developed a new method to detect what are called Baryon Acoustic Oscillations (BAO). These waves, whose existence was first demonstrated in 2005, are one of the few traces of the Big Bang that can still be detected in the cosmos. They spread during the first 380,000 years of the universe's life, expanding like sound waves through matter so hot that it behaved like a fluid, something similar to what happens when a stone is thrown into a pond. Subsequently, the universe expanded and cooled to the point that those waves were frozen in time. The interesting thing about these oscillations, witnesses to almost the entire history of the cosmos, is that their exact size is known (about 500 million light-years), so they are currently very useful for measuring cosmological distances based on the separation between galaxies. Being able to detect them and determine their size is, therefore, of the utmost importance to correctly map the universe out to very distant points. "The results of this study now allow us to detect these waves through a new and independent method. By combining the two, we can determine cosmic distances with greater precision," explained Antonio J. Cuesta, a researcher at the University of Cordoba's Department of Physics and the sole Spanish author on the study. The new method: Looking for anomalies in the orientation of galaxies This new study analyzed, using statistical methods, a database of approximately one million galaxies, paying special attention to two very different factors: the ellipticity of the galaxies and the density around them. In terms of their orientations, galaxies normally stretch toward regions where there are greater numbers of other galaxies, due to the pull of gravity, but there are certain places in the universe where this effect is not as intense. "It is in those points, where galaxies do not point where they should, where statistics tell us that the Baryon Acoustic Oscillations are located, since these waves also act as points of gravity attraction," explained Antonio J. Cuesta. Looking out far, looking into the past "The first practical application that this study could have is to establish more precisely where the galaxies are located, and the separation between them and the Earth, but, in a way, we are also gazing into the past," the researcher explained. This new approach to Baryon Acoustic Oscillations, key to answering some of the great questions about the universe, opens new doors in the world of astronomy. Establishing cosmological distances offers, in turn, new clues about the history of the universe's expansion and helps us to understand its composition in terms of dark matter and energy, two of the most elusive and enigmatic components of the cosmos. More information: Kun Xu et al, Evidence for baryon acoustic oscillations from galaxy–ellipticity correlations, Nature Astronomy (2023). DOI: 10.1038/s41550-023-02035-4 Journal information: Nature Astronomy Provided by University of Córdoba
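To make the standard-ruler idea above concrete: if a feature's physical size is known, the angle it spans on the sky yields its distance. A minimal small-angle sketch in Python; the 500-million-light-year scale comes from the article, the example angle is a made-up input, and a real analysis must also fold in cosmological expansion:

```python
import math

BAO_SCALE_LY = 500e6   # BAO scale quoted in the article, in light-years

def distance_from_angle(angular_size_deg: float) -> float:
    """Small-angle standard-ruler distance: D = L / theta (theta in radians)."""
    return BAO_SCALE_LY / math.radians(angular_size_deg)

# Hypothetical example: a BAO feature subtending 5 degrees on the sky.
print(f"Implied distance: {distance_from_angle(5.0):.2e} light-years")
# About 5.7e9 light-years for this made-up angle: the smaller the angle
# a fixed-size ruler subtends, the farther away it must be.
```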
Space Technology
What it is: Radio images of an annular solar eclipse. When it was taken: Oct. 14, 2023. Where it was taken from: Owens Valley Radio Observatory Long Wavelength Array (OVRO-LWA), California. Why it's so special: Scientists have taken the first radio telescope images of an annular solar eclipse's famous "ring of fire" effect — even though they were outside the eclipse's central path. On Saturday, Oct. 14, 2023, an annular solar eclipse was visible from inside a 125-mile-wide belt through nine U.S. states, including the northeastern tip of California. Owens Valley Radio Observatory, in Big Pine, California, was not within that path, so it was only able to image an 80.5% partial solar eclipse, according to an interactive Google Map from French eclipse cartographer Xavier Jubier. (Within the path of the eclipse, as much as 91% of the sun's light was blocked). However, radio astronomers were still able to capture radio images of the "ring of fire" because OVRO-LWA detected the sun's corona, or super-hot outer atmosphere, which was invisible to those watching the solar eclipse. "From our observatory site in California we were not in the belt to see the annular eclipse, yet we've been able to 'see' it all clearly unfold in radio, which reveals a much larger solar disk than its visible counterpart thanks to its sensitivity to the extended solar corona," Bin Chen, associate professor of physics at New Jersey Institute of Technology's Center for Solar-Terrestrial Research (NJIT-CSTR) who led the observations with colleagues, said in a statement. Radio astronomy is the study of celestial objects at radio frequencies, which are invisible to human eyes. OVRO-LWA's closeness to the path provided a unique opportunity to study the sun's extended corona with the array's 352 antennas, sampling thousands of radio wavelengths at once. "To finally see a 'ring of fire' eclipse this way was spectacular … we haven't seen this quality of radio imaging of the sun before," Dale Gary, a professor of physics at NJIT-CSTR and co-investigator on the OVRO-LWA project, said in the statement. "We normally cannot see the corona from the ground except during a total eclipse, but we can now see it all the time with OVRO-LWA. This eclipse makes it that much more dramatic." The annular solar eclipse on Oct. 14 also crossed parts of Mexico, Belize, Honduras, Nicaragua, Panama, Colombia and Brazil. The next solar eclipse visible from North America will be a total one, crossing from the southwest to the northeast on April 8, 2024.
Space Technology
It isn't just your refrigerator that has magnets on it. The earth, the stars, galaxies, and the space between galaxies are all magnetized, too. The more places scientists have looked for magnetic fields across the universe, the more they've found them. But the question of why that is the case and where those magnetic fields originate from has remained a mystery and a subject of ongoing scientific inquiry. A new paper by Columbia researchers offers insight into the source of these fields. The team used models to show that magnetic fields may spontaneously arise in turbulent plasma. Plasma is a kind of matter often found in ultra-hot environments like that near the surface of the sun, but plasma is also scattered across the universe in low-density environments, like the expansive space between galaxies; the team's research focused on those low-density environments. Their simulations showed that, in addition to generating new magnetic fields, the turbulence of those plasmas can also amplify magnetic fields once they've been generated, which helps explain how magnetic fields that originate on small scales can sometimes eventually stretch across vast distances. The paper was written by astronomy professor Lorenzo Sironi, astronomy research scientist Luca Comisso, and astronomy doctoral candidate Ryan Golant. "This new research allows us to imagine the kinds of spaces where magnetic fields are born: even in the most pristine, vast, and remote spaces of our universe, roiling plasma particles in turbulent motion can spontaneously give birth to new magnetic fields," Sironi said. "The search for the 'seed' that can sow a new magnetic field has been long, and we're excited to bring new evidence of that original source, as well as data on how a magnetic field, once born, can grow."
Space Technology
Amazon is set to launch two satellite prototypes for its Project Kuiper network, which will eventually number more than 3,200 orbiters. Project Kuiper could become a rival to SpaceX’s Starlink constellation, which is now nearly 4,800 strong. Amazon’s launch is planned for 2 pm Eastern time today, with a backup launch window tomorrow. This rapid growth of the satellite industry has come at a cost for astronomers and fans of the night sky, as two new studies and panels at an international astronomy conference stressed this week. All spacecraft in low Earth orbit reflect sunlight, and some glint enough to be visible to the naked eye—artificial constellations that compete with stellar ones. Satellites can cause problems for astronomers when they streak across images, interfere with radio observations, or make hard-earned data less scientifically useful. By one estimate, there could be some 100,000 satellites swarming the skies in the 2030s. While scientists are mainly concerned about this aggregate effect, some individual satellites are very bright indeed. A study published in the journal Nature this week shows that a prototype of AST SpaceMobile’s BlueBird swarm has become one of the brightest objects in the heavens. Another study documents how even deliberately darkened satellites are still twice as bright—if not more—than the limit astronomers have called for to minimize effects on space science. Such concerns prompted a major conference this week, organized by the International Astronomical Union’s Centre for the Protection of the Dark and Quiet Sky from Satellite Constellation Interference, known as CPS. It’s being held in the Canary Islands, where there are several observatories. It’s the first in-person meeting of its kind, bringing together scores of astronomers, as well as satellite industry representatives, advocates of Indigenous and environmental perspectives, and policy experts. “We’re on the cusp of a new era with a crowded, large zoo of satellites. Having a bunch of bright satellites in the sky will be very disruptive to astronomy,” says Aparna Venkatesan, an astrophysicist at the University of San Francisco who spoke at the meeting about environmental and cultural views of the night sky. She coauthored an earlier study about how satellite proliferation boosts the risks of collisions in low Earth orbit and increases the amount of space junk. The CPS meeting was delayed multiple times because of Covid and a volcano eruption, so it’s long overdue, Venkatesan says. “But in a way, waiting has been a gift, because the astronomers and modelers and data takers have been able to organize.” [Image: The United Launch Alliance Atlas V rocket is transported from the Vertical Integration Facility to Space Launch Complex-41 at Cape Canaveral, Florida, in preparation to launch Amazon’s Project Kuiper Protoflight mission. Photograph: United Launch Alliance] Astronomers are concerned that bright satellites can photobomb images and interfere with radio receivers, degrading astronomical data. A team working on the Vera Rubin Observatory in the Chilean Andes, which will become one of the most powerful telescopes on Earth when it opens next year, has proposed a brightness limit of apparent magnitude 7. (Apparent magnitudes describe how bright something appears on Earth, not its absolute brightness. A distant galaxy can have a fainter magnitude than a nearby star or a much closer satellite.) 
But most members of satellite constellations glow much brighter than that, at least part of the time. Satellite networks also create a diffuse light in the night sky, even from orbiters that aren’t individually visible. That light will only brighten if satellites collide, creating reflective bits of flying junk that can’t be masked in images. Starlink satellites have been involved in many near misses, including flying near China’s Tiangong space station. While ground telescopes are the most impaired, a few space telescopes, especially Hubble, have been affected too. Since Hubble orbits slightly below some networks of satellites, a small but increasing percentage of its images have streaks in them. The conference organizers emphasize that astronomers generally don’t oppose satellite constellations, which can deliver broadband access, navigation, and other important services. “The potential benefits to humanity are great, but so are the associated concerns. Creative solutions and technological innovation are needed to confront and solve these problems,” the conference’s website states. But the attendees are struggling to address interference thanks to satellites’ growing numbers. “From the astronomy point of view, there’s nothing we can do to stop this. It’s time to mitigate the effects and reduce the impacts,” says Mike Peel, an astronomer at the Instituto de Astrofísica de Canarias, who co-leads the CPS’s group focused on adapting observation strategies. Astronomers like Harrison Krantz at the University of Arizona are using telescopes to bear witness to these challenges. “These satellites are going to make astronomy more difficult, but not impossible. Let’s assess the situation and see what tools we have at our disposal,” Krantz says. For example, sometimes astronomers can use software that masks pixels affected by streaking. They can also time some observations to avoid clusters of satellites, or avoid pointing their telescopes where satellites are brightest. Krantz and his colleagues recently published the results of a 2.5-year comprehensive survey finding that, despite some astronomers’ assumptions, satellites tend not to be brightest at zenith, or directly overhead, where they’re closest in range. Instead they’re brightest at mid-elevations opposite the sun. Adapting observations isn’t always possible, though—meaning some crucial data will be lost. Satellites also have a long history of interfering with radio telescopes, including the LOFAR network of low-frequency antennas and the Atacama Large Millimeter Array in Chile. Radio signals and electromagnetic radiation from satellites can create static that mimics signs of the cosmic phenomena astronomers are trying to study. “It was always clear that satellites would have this effect, because all electronics have this. It’s inevitable. We knew there would be leaked radiation, but we didn’t know till now how much,” says Benjamin Winkel, an astronomer at the Max Planck Institute for Radio Astronomy in Bonn, Germany, who is attending the conference. Winkel coauthored a study published earlier this year about the radiation levels and frequencies measured from 68 Starlink satellites that passed through the LOFAR station’s beam during an hour of observation. Winkel says SpaceX has attempted to move radio traffic to other frequencies when their satellites fly above telescopes, and to keep their radio beams from being pointed too closely at them. 
But Winkel’s paper concluded those efforts were insufficient, because telescopes are still sensitive to satellites’ internal electronics. “When we looked, something popped up, much brighter than anticipated. It’s not a needle in the haystack,” Winkel says, referring to the electromagnetic radiation from satellites’ onboard electronics. Astronomers at the conference have some key fixes they want from the space industry: to darken satellites to at least magnitude 7, to avoid interfering with “radio-quiet zones” around telescopes, to avoid radio frequency bands near the ones telescopes use, and to share more information with the astronomical community. Winkel points out that international regulations limit how much electromagnetic radiation smartphones and TVs can leak—but so far these rules haven’t been applied to satellites. National regulations and international policies have been moving slower than innovation. Only voluntary changes have emerged so far. For example, SpaceX attempted to add visors to its satellites to block sunlight from hitting the bottom of the chassis, according to a company white paper in 2022. The visors did seem to dim the glint, but they got in the way of a new optical communications system, so the company abandoned the visors, according to SpaceX’s paper. SpaceX has also tried adding coatings to the body of its spacecraft to darken them, and Krantz’s team concluded that it did make them a bit fainter. That’s significant progress, though the satellites are still 2.5 to 6 times brighter than the magnitude 7 threshold astronomers can live with, Krantz says. SpaceX has also begun experimenting with a “dielectric mirror film” to further darken its newest generation of satellites and allow radio waves to pass through them, according to the white paper. Representatives from SpaceX did not respond to WIRED’s requests for comment. But Patricia Cooper, a former SpaceX vice president, told WIRED: “SpaceX has put a lot of money, a lot of time, and a lot of thought into its corrections.” Cooper is now the president of Constellation Advisory LLC, a group that advises satellite companies on policies and regulations. “I am concerned that persistent calls to alarm without a meaningful focus on solutions will deter companies from trying,” she says. In an emailed statement, Amazon spokesperson Brecke Boyd wrote: “As part of our prototype mission, we’ll test an anti-reflection method on one of the two satellites to learn more about whether it’s an effective way to mitigate reflectivity.” The company also plans to use steering and maneuvering capabilities to orient the solar array and spacecraft to minimize reflection from surfaces, according to that statement. Starlink now comprises more than half of all satellites in orbit, and SpaceX is seeking regulatory approval for 30,000 more. Amazon has some catching up to do, though the company plans to fill out its fleet of more than 3,000 by 2029. Both companies’ networks will fly at similar altitudes: between 342 and 392 miles above the Earth. Other networks include OneWeb, which has more than 630 satellites orbiting at a much higher altitude—750 miles. They are therefore dimmer, but take longer to pass out of a telescope’s field of view. AST SpaceMobile’s BlueBird network of communications satellites could number 150 or more, with more than 100 planned for launch by the end of next year. 
The new Nature paper, produced by a team of about 40 researchers, found that its prototype, BlueWalker 3, launched in 2022, reflects more light than almost any star. It is also rather large by satellite standards, at nearly 700 square feet including its broad solar array. “BlueWalker was a shock to us as to how bright it was. We are also very worried about the impact to radio astronomy,” since one of its downlink frequencies is next to a protected radio band at 42.5-43.5 gigahertz, says John Barentine, one of the study’s coauthors and a conference attendee. A Tucson, Arizona-based astronomer, he is also the executive officer of Dark Sky Consulting, which advises companies and government officials on outdoor lighting to preserve dark night skies. “We are working to address the concerns of astronomers,” wrote Scott Wisniewski, AST SpaceMobile’s chief strategy officer, in an email to WIRED. That includes using roll-tilting flight maneuvers to reduce the satellites’ brightness, and preventing them from transmitting near radio telescopes. The company is also planning to equip its next-generation satellites with anti-reflective materials, Wisniewski wrote. Astronomers and industry representatives have more work to do to find solutions. “Maybe the best we can hope for now is a somewhat uneasy coexistence” with industry, Barentine says. The two have to share a single resource: the night sky. “My hope is that we can find a way to do it that minimizes harm to astronomy,” he says.
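One mitigation mentioned above is software that masks the pixels a satellite streak contaminates rather than discarding the whole exposure. A minimal NumPy illustration of that idea using a simple sigma-clip threshold on synthetic data; real pipelines use far more sophisticated streak detection, and every value here is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(42)

# Fake 100x100 sky image: faint background noise...
image = rng.normal(loc=100.0, scale=5.0, size=(100, 100))
# ...plus a bright diagonal "satellite streak".
for i in range(100):
    image[i, i] += 500.0

# Flag pixels far above the image median as streak-contaminated.
median = np.median(image)
sigma = np.std(image)
mask = image > median + 5.0 * sigma

masked_image = np.ma.masked_array(image, mask=mask)
print(f"Masked {mask.sum()} of {image.size} pixels")
print(f"Mean with streak: {image.mean():.1f}, after masking: {masked_image.mean():.1f}")
# The masked mean returns to the ~100 background level; the affected
# pixels are simply excluded from downstream measurements.
```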
Space Technology
Scientists in China are building the world's largest "ghost particle" detector 11,500 feet (3,500 meters) beneath the surface of the ocean. The Tropical Deep-sea Neutrino Telescope (TRIDENT) — called Hai ling or "Ocean Bell" in Chinese — will be anchored to the seabed of the Western Pacific Ocean. Upon completion in 2030, it will scan for rare flashes of light made by elusive particles as they briefly become tangible in the ocean depths. Every second, about 100 billion ghost particles, called neutrinos, pass through each square centimeter of your body. And yet, true to their spooky nickname, neutrinos' nonexistent electrical charge and almost-zero mass mean they barely interact with other types of matter. But by catching the rare occasions when neutrinos interact with matter, physicists can trace some of the particles' origins billions of light-years away to ancient, cataclysmic stellar explosions and galactic collisions. That's where the ocean bell comes in. "Using Earth as a shield, TRIDENT will detect neutrinos penetrating from the opposite side of the planet," Xu Donglian, the project's chief scientist, told journalists at a news conference Oct. 10. "As TRIDENT is near the equator, it can receive neutrinos coming from all directions with the rotation of the Earth, enabling all-sky observation without any blind spots." Neutrinos are everywhere — they are second only to photons as the most abundant subatomic particles in the universe and are produced in the nuclear fire of stars, in enormous supernova explosions, in cosmic rays and radioactive decay, and in particle accelerators and nuclear reactors on Earth. Despite their ubiquity, their minimal interactions with other matter make neutrinos incredibly difficult to detect. They were first discovered zipping out of a nuclear reactor in 1956, and many neutrino-detection experiments have spotted the steady bombardment of the particles sent to us from the sun; but this cascade masks rarer neutrinos produced when cosmic rays, whose sources remain mysterious, strike Earth's atmosphere. Neutrinos pass completely unimpeded through most matter, including the entirety of our planet, but they do occasionally interact with water molecules. As neutrinos travel through water or ice, they sometimes create particle byproducts called muons that give off flashes of light. By studying the patterns these flashes make, scientists can reconstruct the energy, and sometimes the sources, of the neutrinos. But to increase the chances of ghost particle interactions, detectors have to sit under a lot of water or ice. China's gigantic new detector will consist of more than 24,000 optical sensors beaded across 1,211 strings, each 2,300 feet (700 m) long, that will bob upward from their anchoring point on the seabed. The detector will be arranged in a Penrose tiling pattern and will span a diameter of 2.5 miles (4 kilometers). When it's operational, it will scan for neutrinos across 1.7 cubic miles (7.5 cubic kilometers). The world's current largest neutrino detector, IceCube, located at the Amundsen-Scott South Pole Station in Antarctica, has a monitored volume of only 0.24 cubic miles (1 cubic km), meaning TRIDENT will be significantly more sensitive and much more likely to find neutrinos. The scientists say that a pilot project will begin in 2026, and the full detector will come online in 2030. 
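As a rough cross-check, the quoted geometry is consistent with the stated monitored volume. A quick Python sketch treating the array as a full cylinder, which slightly overestimates the figure since the Penrose-tiled string layout does not fill the whole circle:

```python
import math

diameter_km = 4.0        # array footprint quoted in the article
string_length_km = 0.7   # the 2,300-foot (700 m) sensor strings

# Upper bound: volume of a cylinder fully spanned by the strings.
cylinder_km3 = math.pi * (diameter_km / 2) ** 2 * string_length_km
print(f"Cylinder volume: {cylinder_km3:.1f} km^3")   # about 8.8 km^3

# The article quotes 7.5 km^3 monitored, versus IceCube's ~1 km^3:
print(f"Volume ratio vs IceCube: {7.5 / 1.0:.1f}x")
```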
"TRIDENT intends to push the limits of neutrino telescope performance, reaching a new frontier of sensitivity in all-sky searches for astrophysical neutrino sources," the researchers wrote in a paper outlining the detector, published Oct. 9 in the journal Nature Astronomy. Live Science newsletter Stay up to date on the latest science news by signing up for our Essentials newsletter. Ben Turner is a U.K. based staff writer at Live Science. He covers physics and astronomy, among other topics like tech and climate change. He graduated from University College London with a degree in particle physics before training as a journalist. When he's not writing, Ben enjoys reading literature, playing the guitar and embarrassing himself with chess.
Space Technology
An encouraging new study has found that the interference from exoplanets—planets that orbit stars outside our solar system—has been overestimated in searches for extraterrestrial signals. The results from the study, released last week in The Astronomical Journal, mean that scientists can concentrate on finer frequency shifts, markedly improving the potential effectiveness of campaigns to sniff out alien technosignatures, namely radio signals. The laudable quest to discover signals from alien civilizations, a campaign known as SETI (Search for Extraterrestrial Intelligence), hinges on sifting through potential signals and differentiating them from “noise,” that is, natural signals. An improved understanding of the noise produced by exoplanets as they orbit their stars can significantly refine the search for alien signals. Indeed, the ability to pinpoint genuine alien transmissions amidst the vast cacophony of space is pivotal for the success of these search campaigns. Hence the importance of the new paper, led by Megan Grace Li, a PhD student at UCLA. Li conducted this research as a National Science Foundation intern for the Research Experience for Undergraduates at the Breakthrough Listen project at the Berkeley SETI Research Center. “This work gives deeper insight into what extraterrestrially transmitted signals may look like if they come from exoplanets, informing not only the parameter space of technosignature searches but also possible interpretations of detected signals,” said Li in a SETI Institute press release. Scientists must consider how exoplanets move relative to Earth while searching for alien signals; this helps them to pinpoint potential sources and determine if detected signals are genuine or just from celestial movements. Now, when scientists try to capture signals from distant exoplanets, the Doppler effect comes into play. And yes, this is the same phenomenon that causes the pitch of an ambulance siren to change as it zooms past. In the case of SETI astronomy, the Doppler effect results in a shift in the frequency of signals due to the relative movement between the transmitting exoplanet and Earth. This variation in frequency, termed the “drift rate,” is influenced by both the Earth’s and the exoplanet’s orbits and rotations. A lower drift rate indicates a more stable signal, which is essential for distinguishing potential alien transmissions from natural interferences. Previously, based on the research of Sofia Sheikh from the SETI Institute, it was believed that, in the most extreme cases, exoplanetary systems exhibited drift rates of up to 200 nHz, prompting Sheikh to propose this value as a threshold (thresholds help scientists prioritize signals that are more likely to be stable and potentially of interest). However, the new research, which took data from over 5,300 known exoplanets from the NASA Exoplanet Archive, revealed a surprise: in 99% of cases, the drift rate caused by these exoplanets was no more than 53 nHz. Furthermore, for stars without any known orbiting planets, the drift rate plummeted to a mere 0.44 nHz. In essence, these findings suggest that the previous threshold of 200 nHz may have significantly underestimated the potential stability of signals originating from extraterrestrial civilizations, potentially making it easier for us to detect deliberate transmissions from aliens; the lower drift rate threshold allows SETI researchers to focus on more stable signals, which are easier and faster to analyze. 
And as Sheikh explained in the press release: “These results imply that, in many cases, the drift rate will be so low that we can prioritize other parameters (such as covering more frequencies or analyzing datasets faster) without worrying that we will miss true signals.” The newly established limits, which include the majority of drift rates generated by stable radio signals from exoplanets, are poised to significantly reduce the time and computing expenses required for future searches, including the one planned by Breakthrough Listen using the MeerKAT telescope. This means that these searches could become almost a thousand times faster and more cost-effective, according to the paper. This is amazing news for future SETI campaigns. The added precision should allow for a more efficient use of resources, akin to searching a specific corner of a haystack for a needle rather than having to scour the entire stack.
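The physics behind these drift-rate numbers is compact: a transmitter with line-of-sight acceleration a imposes a fractional frequency drift of roughly a/c per second, which is the normalized "nHz" unit used in these studies. A minimal Python check; the 0.01-AU orbit below is a hypothetical input chosen to illustrate the extreme end of the scale, not a value from the paper:

```python
import math

C = 2.998e8        # speed of light, m/s
G = 6.674e-11      # gravitational constant
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit, m

def normalized_drift_nhz(accel_ms2: float) -> float:
    """Fractional frequency drift (df/dt)/f = a/c, expressed in nHz (1e-9 per second)."""
    return accel_ms2 / C * 1e9

# Earth's rotation: the drift any ground-based receiver itself contributes.
earth_spin_accel = (2 * math.pi / 86164) ** 2 * 6.378e6   # centripetal accel, m/s^2
print(f"Earth rotation: {normalized_drift_nhz(earth_spin_accel):.2f} nHz")   # ~0.11

# Hypothetical transmitter on a planet orbiting a sun-like star at 0.01 AU:
extreme_accel = G * M_SUN / (0.01 * AU) ** 2
print(f"Ultra-close planet: {normalized_drift_nhz(extreme_accel):.0f} nHz")  # ~198
```

The Earth-rotation term lands at the same order of magnitude as the paper's 0.44 nHz figure for stars without known planets, and the made-up ultra-close orbit reproduces the roughly 200 nHz extreme quoted above.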
Space Technology
One of the most exciting moments in a landing mission is when the first images from the surface of another world are returned back to Earth. This was no less true for the Viking 2 landing on Mars even though it had been preceded by the successful landing of its sister, Viking 1, on July 20, 1976. Viking 2 landed on a fairly smooth northern plain known as Utopia Planitia at 47.64° N, 225.71° W on September 3, 1976 at 22:37:50 GMT. Minutes after landing, Viking 2 used one of its pair of cameras to acquire a black and white image of a 70° by 20° strip in front of the lander, starting about 1.4 meters from the camera, which included a view of the footpad. Although superficially similar to the first image returned 45 days earlier by Viking 1 (see “First Pictures: Viking 1 on Mars – July 20, 1976“), the 10 to 20-centimeter rocks in the scene had vesicles or pits suggesting erosion by wind action. After transmission of its first image showing the foreground was completed, the Lander’s second camera started work on a much larger 330° panorama providing a fuller view of the landing site. Taken at about 10 AM local Martian time, the panorama showed a landing site similar to that of Viking 1 save for the pitted rocks and the lack of any dune-like features. The left side of the panorama, looking towards the northwest, shows a featureless horizon about four kilometers distant. On the right side towards the southwest, nearby rocks are silhouetted against the skyline suggesting that they are on a slight rise closer to the Lander. The curved appearance of the horizon is an artifact caused by an 8° tilt of the Lander towards the west and how the camera scanned the scene. Viking 2 took its first color image of the Red Planet on Sol 2 of its mission. The Viking landers each sported a pair of 7.3-kilogram cameras on their upper deck mounted 0.822 meters apart to provide stereo views of the landscape. Unlike the vidicon-based cameras used on NASA’s robotic Surveyor lunar landers a decade earlier, which returned individual frames of the scene (see “Surveyor 1: America’s First Moon Lander”), the Viking Lander cameras used a scanning mirror to reflect the scene onto a set of a dozen light-sensitive photodiodes. The nodding motion of the mirror allowed one column of the scene to be scanned before the camera turret rotated stepwise in azimuth to allow the adjacent columns to be scanned one at a time. Each camera could scan up to 342.5° in azimuth and from 40° above to 60° below the horizon. Earlier Soviet Luna, Mars and Venera landers used telephotometers operating on a similar principle (see “Luna 9: The First Lunar Landing” and “Venera 9 & 10 to Venus“). The array of a dozen detectors allowed the scene to be scanned in six spectral bands for color and near-infrared imaging at an image scale of 0.12° per pixel, or in black and white at a finer image scale of 0.04° per pixel, with four different focus steps ranging from 1.9 to 13.3 meters. Each image column scan was broken up into 512 pixels digitized to 6 bits. The scanning rate was synchronized with the 16,000 bits per second transmission rate using the Viking Orbiters as a relay or the 250 bits per second rate for direct transmission to Earth via the Lander’s high gain antenna (seen on the right side of the first panorama returned by Viking 2). The images could also be stored on a 40 megabit tape recorder for later transmission. 
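Since column scanning was synchronized to the transmission rate, the figures above support a back-of-the-envelope timing estimate. A minimal Python sketch using only numbers quoted in this article; treat it as illustrative arithmetic rather than the mission's exact timing budget:

```python
PIXELS_PER_COLUMN = 512
BITS_PER_PIXEL = 6
AZIMUTH_STEP_DEG = 0.04     # high-resolution black-and-white mode
RELAY_RATE_BPS = 16_000     # via a Viking Orbiter relay
DIRECT_RATE_BPS = 250       # direct-to-Earth link

bits_per_column = PIXELS_PER_COLUMN * BITS_PER_PIXEL   # 3,072 bits

def scan_time_minutes(span_deg: float, rate_bps: float) -> float:
    """Panorama scan time when column scanning is synced to the link rate."""
    columns = span_deg / AZIMUTH_STEP_DEG
    return columns * bits_per_column / rate_bps / 60

print(f"342.5-degree panorama via relay: {scan_time_minutes(342.5, RELAY_RATE_BPS):.0f} min")
print(f"Same panorama direct to Earth: {scan_time_minutes(342.5, DIRECT_RATE_BPS) / 60:.0f} hours")
# Roughly 27 minutes through an Orbiter relay, versus about 29 hours on the
# 250 bps direct link -- which is why the tape recorder mattered.
```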
The Viking 2 lander continued operating far beyond the end of its primary mission on October 5, 1976, returning images and other scientific data for years to come (see Related Reading below). The Viking 2 lander was finally shut down on April 11, 1980 after 1281 Sols on the surface because of battery failure. Related Reading “First Pictures: Viking 1 on Mars – July 20, 1976”, Drew Ex Machina, July 20, 2021 [Post] “Viking & The First Seismometers on Mars”, Drew Ex Machina, November 21, 2018 [Post] “NASA’s Viking Mission and the Search for Life on Mars: The Experiments”, Drew Ex Machina, July 28, 2022 [Post]
Space Technology
In the quest to address climate change and reduce greenhouse gas emissions, detecting methane leaks – a potent contributor to global warming – has become increasingly vital. Researchers are harnessing the capabilities of cutting-edge satellite technology to monitor these leaks from space. Why methane matters in the fight against climate change Methane is a powerful greenhouse gas and is the second-largest contributor to climate warming after carbon dioxide. A tonne of methane, despite its shorter lifespan of about 10 years in the atmosphere, can retain an astounding 30 times more heat than a tonne of carbon dioxide over the course of a century. This means that when it comes to warming our planet, methane is a potent player. But here's the good news: because methane doesn't stick around as long as carbon dioxide, it provides us with an opportunity to take relatively swift climate action. If we reduce methane emissions, we can actually see a tangible reduction in global methane levels within just a decade. This, in turn, helps to mitigate the enhanced greenhouse effect. Now, let's talk about ‘super-emitters’. While methane emitters refer to any source of methane, ranging from natural processes like wetlands to human activities such as agriculture, methane super-emitters release a disproportionately large amount of methane compared to other emitters. These are typically found amongst industrial facilities, such as oil and gas operations, coal mines, or even landfills, that have equipment or infrastructure issues leading to significant methane leaks. These super-emitters are the low-hanging fruits in our quest to cut emissions. Fixing these super-emitters doesn't require complex or expensive solutions. In many cases, relatively simple repairs can result in significant climate gains. However, there's a challenge: we first need to identify these super-emitters. That way, we can target our efforts effectively and start making a difference in the fight against climate change. Using machine learning for methane detection The Copernicus Sentinel-5P satellite measures methane by observing Earth's atmosphere and, specifically, the shortwave infrared bands. These bands are like unique fingerprints for methane, allowing Sentinel-5P to detect its presence with remarkable precision. This wealth of data plays a critical role in our efforts to comprehend and address the consequences of methane emissions on our climate and environment, making it an indispensable tool in the battle against climate change. Researchers from SRON Netherlands Institute for Space Research have announced a new algorithm that automatically discovers methane super-emitter plumes in Sentinel-5P data using machine learning. It also automatically calculates the associated emissions based on the measured concentrations and concurrent wind speeds. Berend Schuit from SRON explains, “Before, we manually identified the largest emitters, but it remains difficult to search through the millions of Tropomi pixels. A methane plume often only covers a few pixels. We now automatically get a list of detections from the machine learning model every day. “We check those manually every week to make sure we are confident about the detections. What remains, dozens of methane plumes every week, we publish online. 
We communicate persistent leaks to other satellites with higher resolution so they can precisely identify the source.” “This information is used by the United Nations' International Methane Emissions Observatory to find a solution together with the responsible companies or authorities.” Co-author Bram Maasakkers, from SRON, added, “The dozens of methane plumes that Tropomi detects every week really present a golden opportunity in the fight against global warming.” “If it’s visible from space, it is serious. For the first time, we now get a good global picture of these super-emitters. In our publication, we describe the 2974 plumes that we found in 2021; 45% originate from oil and gas facilities but we also see plumes from urban areas (35%) and coal mines (20%). “We detect human-made emissions with a climate impact that is significantly larger than total greenhouse gas emissions of The Netherlands. In many cases, those leaks are easy to fix.” The paper, published today in Atmospheric Chemistry and Physics, can be accessed online. A three-tiered approach for methane detection Typically, the detection of methane emissions relies on Copernicus Sentinel-5P. Only fairly recently have scientists begun harnessing the power of combining data from multiple satellites to monitor methane emissions from space, including the combined capabilities of the Copernicus Sentinel-5P and Sentinel-2 satellites. These high-tech space-based tools work in tandem to monitor and assess methane emissions on a global scale, allowing researchers to not only detect the presence of methane but also to localise and quantify emissions accurately. With daily global coverage, Sentinel-5P is renowned for its high-precision methane measurements and can detect methane leaks anywhere on Earth. However, there’s a catch. The spatial resolution is relatively coarse, at 7x5.5 km. This means it can identify the presence of methane but not pinpoint its source with precision. The Sentinel-2 satellites, on the other hand, are equipped with multi-band instruments that are not designed to observe methane concentrations but can identify precise locations of major methane leaks (emitting more than one tonne per hour) with a remarkable resolution of 20 m. But Sentinel-2 lacks daily global coverage, so it might miss out on capturing crucial data during certain emission periods. But what about the Sentinel-3 mission? The satellites are equipped with multi-band radiometers that can observe shortwave infrared bands which are sensitive to methane concentrations. These satellites offer global coverage on a daily basis and a ground pixel resolution of 500 m. In a recent paper published in Remote Sensing of Environment, researchers from SRON found that the Sentinel-3 satellites can retrieve methane enhancements from its shortwave infrared band measurements. Impressively, it can detect the largest methane leaks of at least 10 tonnes per hour, depending on factors like location and wind conditions, every single day. This puts it in a unique position to identify and monitor methane leaks. Near the Hassi Messaoud oil/gas field in Algeria, researchers identified a continuous methane emission from a leaking facility for six days. The methane plume, detected by Sentinel-5P over Algeria on 4 January 2020, extended for more than 200 km northeast. The team used a Sentinel-2 image to zoom in on the plumes’ origins and pinpointed the exact location of the leak to be an oil/gas well, while Sentinel-3 showed the leak continued for six days. 
When analysing these leaks, both Sentinel-2 and Sentinel-3 provided similar estimates of methane emissions – showcasing Sentinel-3’s utility in quantifying emissions. Combining the data from these two satellites allows researchers to zoom in with precision, identifying, quantifying and monitoring methane sources corresponding to plumes observed in Sentinel-5P’s global scans. Sudhanshu Pandey, lead author and now scientist at NASA Jet Propulsion Laboratory, commented, “Who would have thought we can use three different Sentinel missions in a tiered approach to first spot methane super emitters from space globally with Tropomi on Sentinel-5P, then zooming in with Sentinel-3 and Sentinel-2, we’re able to identify the exact source responsible at facility level. This is the type of information we need to take swift action.” In the fight against climate change, understanding and mitigating methane emissions are of paramount importance. Sentinel-3, with its unique combination of daily global coverage and high-resolution methane detection, emerges as a valuable asset in the arsenal of tools to track down and address these elusive leaks. As technology advances and our understanding deepens, satellite observations will play a pivotal role in the global effort to combat climate change.
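Neither paper's full retrieval pipeline is reproduced here, but the quantification step the SRON team describes, turning measured concentrations plus wind speed into an emission rate, is commonly done with an integrated-mass-enhancement (IME) style calculation. The sketch below is a minimal illustration of that idea under assumed inputs (plume mask, pixel size, effective wind speed), not the team's actual algorithm:

```python
import numpy as np

def emission_rate(enhancement_kg_m2, plume_mask, pixel_area_m2, wind_m_s):
    """Integrated-mass-enhancement (IME) style source-rate estimate.
    `enhancement_kg_m2` is the methane column enhancement above
    background, per pixel."""
    # Total excess methane mass visible in the plume (kg).
    ime = np.sum(enhancement_kg_m2[plume_mask]) * pixel_area_m2
    # Characteristic plume length: square root of the masked area (m).
    length = np.sqrt(np.count_nonzero(plume_mask) * pixel_area_m2)
    # Mass divided by the gas residence time gives kg/s.
    return ime / (length / wind_m_s)

# Toy plume at a Sentinel-5P-like 7 x 5.5 km pixel size.
enh = np.zeros((10, 10))
enh[4:7, 4:7] = 2e-5                 # ~0.02 g/m^2 excess in 9 pixels
mask = enh > 0
q = emission_rate(enh, mask, pixel_area_m2=7000 * 5500, wind_m_s=5.0)
print(f"~{q * 3.6:.1f} tonnes per hour")   # kg/s -> t/h
```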
Space Technology
For centuries, humankind has been captivated by the thought of life on other planets. But how will we recognise it when we see it? Researchers have developed an artificial intelligence system that can detect signs of life with 90 per cent accuracy. And they say it signifies a 'significant advance' in our abilities to discover life across the solar system and beyond. Many of the components necessary for life, such as amino acids and nucleotides needed to make DNA, have been detected in space. But it is hard to determine whether they are biotic, meaning they are a sign of life, or if they are abiotic, meaning they relate to non-living things like gases and chemicals. The scientists, from George Mason University in Virginia, devised an AI model that can predict whether a sample is biotic or abiotic with 90 per cent accuracy. They created it by analysing 134 varied samples from a range of living cells, fossil fuels, meteorites and organic compounds. The analysis involved separating the sample into its component parts, and then detecting subtle differences between molecular patterns and weights. Lead researcher Dr Robert Hazen said: 'This is a significant advance in our abilities to recognise biochemical signs of life on other worlds. 'It opens the way to using smart sensors on unmanned spaceships to search for signs of life. 'These results mean that we may be able to find a lifeform from another planet, another biosphere, even if it is very different from the life we know on Earth. 'And, if we do find signs of life elsewhere, we can tell if life on Earth and other planets derived from a common or different origin.' The team believe their model could be used to test samples already collected by the Mars Curiosity rover. It could also help scientists reveal the history of mysterious, ancient rocks on Earth to determine when life began. 'This routine analytical method has the potential to revolutionize the search for extraterrestrial life and deepen our understanding of both the origin and chemistry of the earliest life on Earth,' Dr Hazen said. 'It opens the way to using smart sensors on robotic spacecraft, landers and rovers to search for signs of life before the samples return to Earth.' The findings were published in the journal Proceedings of the National Academy of Sciences.
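The article doesn't spell out the team's pipeline, but the general shape, a supervised classifier trained on per-sample vectors of molecular measurements labeled biotic or abiotic, can be sketched as follows. The feature count, the model choice, and the synthetic data are all illustrative assumptions, not the study's method:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Stand-in dataset: 134 samples, each reduced to a fixed-length vector
# of molecular-fragment intensities (features are purely synthetic).
X = rng.normal(size=(134, 50))
y = rng.integers(0, 2, size=134)     # 1 = biotic, 0 = abiotic
# Give the "biotic" class a subtle shift in a few features, mimicking
# the small molecular-pattern differences the study relied on.
X[y == 1, :5] += 0.8

clf = RandomForestClassifier(n_estimators=300, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```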
Space Technology
How massive is the Milky Way? It’s an easy question to ask, but a difficult one to answer. Imagine a single cell in your body trying to determine your total mass, and you get an idea of how difficult it can be. Despite the challenges, a new study has calculated an accurate mass of our galaxy, and it’s smaller than we thought. One way to determine a galaxy’s mass is by looking at what’s known as its rotation curve. Measure the speed of stars in a galaxy versus their distance from the galactic center. The speed at which a star orbits depends on the amount of mass within its orbit, so from a galaxy’s rotation curve you can map the function of mass per radius and get a good idea of its total mass. We’ve measured the rotation curves for several nearby galaxies such as Andromeda, so we know the masses of many galaxies quite accurately. But since we are in the Milky Way itself, we don’t have a great view of stars throughout the galaxy. Toward the center of the galaxy, there is so much gas and dust we can’t even see stars on the far side. So instead we measure the rotation curve using neutral hydrogen, which emits faint light with a wavelength of about 21 centimeters. This isn’t as accurate as stellar measurements, but it has given us a rough idea of our galaxy’s mass. We’ve also looked at the motions of the globular clusters that orbit in the halo of the Milky Way. From these observations, our best estimate of the mass of the Milky Way is about a trillion solar masses, give or take. This new study is based on the third data release of the Gaia spacecraft. It contains the positions of more than 1.8 billion stars and the motions of more than 1.5 billion stars. While this is only a fraction of the estimated 100-400 billion stars in our galaxy, it is a large enough number to calculate an accurate rotation curve. Which is exactly what the team did. Their resulting rotation curve is so precise that the team could identify what’s known as the Keplerian decline. This is the outer region of the Milky Way where stellar speeds start to drop off roughly in accordance with Kepler’s laws since almost all of the galaxy’s mass is closer to the galactic center. The Keplerian decline allows the team to place a clear upper limit on the mass of the Milky Way. What they found was surprising. The best fit to their data placed the mass at about 200 billion solar masses, which is a fifth of previous estimates. The absolute upper mass limit for the Milky Way is 540 billion solar masses, meaning that the Milky Way is at most about half as massive as we thought. Given the amount of known regular matter in the galaxy, this means the Milky Way has significantly less dark matter than we thought. Reference: Jiao, Yongjun, et al. “Detection of the Keplerian decline in the Milky Way rotation curve.” arXiv preprint arXiv:2309.00048 (2023).
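The physics behind reading mass off a rotation curve is the circular-orbit balance v^2/r = GM/r^2, which gives M = v^2 r / G for the mass enclosed within radius r. A quick sanity check in Python with round solar-neighborhood numbers (assumed for illustration, not taken from the Gaia study):

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
KPC = 3.086e19     # kiloparsec, m

v = 220e3          # assumed orbital speed near the sun, m/s
r = 8 * KPC        # assumed galactocentric radius of the sun

# Circular-orbit balance: v^2 / r = G M / r^2  =>  M(<r) = v^2 r / G
m_enclosed = v**2 * r / G
print(f"enclosed mass ~ {m_enclosed / M_SUN:.1e} solar masses")  # ~9e10
# Most of the galaxy's mass sits inside the solar circle, which is why
# a measured Keplerian decline farther out pins down the total.
```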
Space Technology
Scientists may have just found what makes the sun's outer atmosphere, the corona, so inexplicably hot. For decades, scientists have been struggling to explain why temperatures in the sun's outer atmosphere, the corona, reach mind-boggling temperatures of over 1.8 million degrees Fahrenheit (one million degrees Celsius). The sun's surface is only about 10,000 degrees F (6,000 degrees C), and with the corona farther away from the source of the heat inside the star, the outer atmosphere should, in fact, be cooler. New observations made by the Europe-led Solar Orbiter spacecraft have now provided hints to what might be behind this mysterious heating. Using images taken by the spacecraft's Extreme Ultraviolet Imager (EUI), a camera that detects the high-energy extreme ultraviolet light emitted by the sun, scientists have discovered small-scale fast-moving magnetic waves that whirl on the sun's surface. These fast-oscillating waves produce so much energy, according to the latest calculations, that they could explain the coronal heating. Scientists have previously detected slower magnetic waves, but those didn't seem to produce enough energy to explain the enormous temperature difference between the sun's surface and the outer atmosphere. "Over the past 80 years, astrophysicists have tried to solve this problem and now more and more evidence is emerging that the corona can be heated by magnetic waves," Tom Van Doorsselaere, a professor of plasma physics at the Catholic University of Leuven in Belgium and one of the authors of the new study, said in a statement. The newly discovered structures can be seen in a video sequence captured by the EUI instrument in October last year. Each of the magnetic oscillations, highlighted in blue, green and red rectangles, is less than 6,200 miles (10,000 kilometers) wide. For context, the solar disk measures 864,000 miles (1,392,000 km) in diameter. Solar Orbiter, launched in February 2020, takes the closest images of the star at the center of our solar system. Although Earth-based telescopes can provide images of the sun in a higher resolution, these telescopes can't study the extreme ultraviolet part of the solar light spectrum. These frequencies are filtered out by Earth's atmosphere, so ground-based telescopes don't see many of the key phenomena driving the sun's behavior. Solar Orbiter, which makes regular approaches to less than 48 million miles (77 million km) from the sun (closer than the orbit of the solar system's innermost planet Mercury), doesn't have those issues. In its first images of the sun alone, released in June 2020, Solar Orbiter found other indications of processes that might play a role in the coronal heating mystery. David Berghmans, the principal investigator of the EUI instrument and solar physicist at the Royal Observatory of Belgium, added that the team will now dedicate more time to studying the newly discovered magnetic waves on the sun's surface. "Since her results indicated a key role for fast oscillations in coronal heating, we will devote much of our attention to the challenge of discovering higher-frequency magnetic waves with EUI," Berghmans said in the statement. The study was published on Monday, July 24, in the Astrophysical Journal Letters.
Space Technology
Stepping out on a scorching summer afternoon can be an excruciating prospect, but less so if your trusty umbrella is there to save the day. Now, what if we could similarly build and deploy a life-saving umbrella for our very own planet? At first glance, this brilliant idea might seem quite implausible. But with technological advancements and a few modifications here and there, we could actually build one giant trusty shield for Earth, according to István Szapudi, an astronomer at the University of Hawaiʻi Institute for Astronomy. A solar shield with a tethered twist! Inspired by everyday life, Szapudi has proposed a massive solar shield to protect Earth from the harsh rays of the Sun. Not only would it help cut down some fraction of the Sun's heat, but possibly mitigate global warming as well! But while this idea has been doing the rounds in science circles for a while now, the concept is marred by several limitations, the biggest of which is the problem of weight. In outer space, weight translates into unrealistic costs, as the shield needs to be heavy enough to counteract the gravitational forces and withstand intense solar winds and pressure. This is where Szapudi’s recent study comes into the picture. As per his proposal, two novel innovations push this idea from mere theories into the realm of possibilities. First, instead of a big and heavy shield alone, Szapudi suggests using a Sun shield tethered to a counterweight towards the Sun. And secondly, the counterweight in question could be a captured asteroid. By incorporating a tethered asteroid, we would need to launch just the shield structure from Earth. This clever addition significantly brings down the total mass required for the shield. And such a structure would be faster and cheaper to build and deploy than other shield designs. Moreover, estimates suggest that reducing just 1.7% of solar radiation can help prevent a catastrophic jump in global temperatures. To achieve this, Szapudi's calculations revealed that with a tethered counterbalance, the shield and counterweight together should weigh around 3.5 million tons — over 100 times lighter than previously estimated untethered shield designs. Further estimations revealed that only about 1% of it, approximately 35,000 tons, would be the actual shield to be launched from Earth. The remaining 99% could be made up of lunar dust and materials from asteroids, which can be managed in space itself. Of course, we're not there yet. Present-day rockets can only carry about 50 tons to low-Earth orbit, making shield deployment a tough task. And we’ll also need to develop a strong but lightweight graphene tether to hold the shield and asteroid together. However, as technology takes a quantum leap and more lightweight and robust materials develop, the shield's mass could be reduced even further — making the deployment of this life-saving sunshade possible much sooner than most would think! This study was published recently in the journal PNAS.
Space Technology
High-frequency magnetic waves surging through the sun may explain why our star's atmosphere is 200 times hotter than its surface. The temperature of the upper atmosphere of the sun, called the corona, can soar to over 2 million degrees Fahrenheit (1.1 million degrees Celsius), while 1,000 miles (1,600 kilometers) closer to the core, the photosphere — the sun's visible surface — simmers at a relatively chilled 10,000 F (5,500 C). This coronal heating problem arises from the sun's main heat source, which is the nuclear fusion occurring at its core. Stellar models suggest that regions farther from the core should see drops in temperature — something the corona defies by being hotter than the underlying photosphere. This is like moving away from a fire only to find the air becoming hotter. Scientists have long suspected that magnetic phenomena could play a role in helping the sun's upper atmosphere maintain its physics-defying high temperatures. Now, observations of small and rapid oscillations in magnetic structures in the corona by the European Space Agency's (ESA) Solar Orbiter spacecraft could finally pinpoint exactly what heats up the corona. "Over the past 80 years, astrophysicists have tried to solve this problem and now more and more evidence is emerging that the corona can be heated by magnetic waves," Tom Van Doorsselaere, a plasma astrophysicist at KU Leuven in Belgium, said in a statement. Van Doorsselaere is a co-author of a new paper detailing the research, published July 17 in the Astrophysical Journal Letters. Despite being much hotter than the photosphere, the corona is still washed out by light from the underlying photosphere. That means observing it from Earth requires waiting for a solar eclipse, in which the disk of the moon blocks out the photosphere, or using specialized equipment that replicates the effect. From its position around 26 million miles (42 million km) from the sun, the Solar Orbiter has no such problems. The ESA spacecraft can use its Extreme Ultraviolet Imager (EUI) telescope operated by the Royal Observatory of Belgium (ROB) to create images of the solar corona with unprecedented resolution. The spacecraft, currently observing the far side of the sun from our perspective here on Earth, used EUI's Full Sun Imager and High-Resolution Imager to spot the small magnetic waves traversing the plasma, a broiling hot gas of charged particles that comprises our star, on Oct. 12, 2022. After EUI revealed these new fast, small-scale oscillations, the team wanted to know if they contributed more energy to coronal heating than previously discovered slower, low-frequency oscillations do. To investigate this, the team conducted a meta-analysis of several previous solar studies. From this analysis, the scientists concluded that high-frequency oscillations do indeed provide significantly more energy for the heating of the corona than their slower counterparts. To confirm the link between coronal heating and high-frequency magnetic waves, scientists will continue observing the sun's outer atmosphere with the Solar Orbiter and its instruments. "Since her results indicated a key role for fast oscillations in coronal heating, we will devote much of our attention to the challenge of discovering higher-frequency magnetic waves with EUI," ROB researcher and EUI principal investigator David Berghmans said in the statement.
Space Technology
Sun 'umbrella' tethered to asteroid might help mitigate climate change Earth is rapidly warming and scientists are developing a variety of approaches to reduce the effects of climate change. István Szapudi, an astronomer at the University of Hawaiʻi Institute for Astronomy, has proposed a novel approach—a solar shield to reduce the amount of sunlight hitting Earth, combined with a tethered, captured asteroid as a counterweight. Engineering studies using this approach could start now to create a workable design that could mitigate climate change within decades. The paper, "Solar radiation management with a tethered sun shield," is published in Proceedings of the National Academy of Sciences. One of the simplest approaches to reducing the global temperature is to shade the Earth from a fraction of the sun's light. This idea, called a solar shield, has been proposed before, but the large amount of weight needed to make a shield massive enough to balance gravitational forces and prevent solar radiation pressure from blowing it away makes even the lightest materials prohibitively expensive. Szapudi's creative solution consists of two innovations: a tethered counterweight instead of just a massive shield, reducing the total mass more than 100-fold, and the use of a captured asteroid as the counterweight to avoid launching most of the mass from Earth. "In Hawaiʻi, many use an umbrella to block the sunlight as they walk about during the day. I was thinking, could we do the same for Earth and thereby mitigate the impending catastrophe of climate change?" Szapudi said. Incorporating a tethered counterbalance Szapudi began with the goal of reducing solar radiation by 1.7%, an estimate of the amount needed to prevent a catastrophic rise in global temperatures. He found that placing a tethered counterbalance toward the sun could reduce the weight of the shield and counterweight to approximately 3.5 million tons, about one hundred times lighter than previous estimates for an untethered shield. While this number is still far beyond current launch capabilities, only 1% of the weight—about 35,000 tons—would be the shield itself, and that is the only part that would have to be launched from Earth. With newer, lighter materials, the mass of the shield could be reduced even further. The remaining 99% of the total mass would be asteroids or lunar dust used as a counterweight. Such a tethered structure would be faster and cheaper to build and deploy than other shield designs. Today's largest rockets can only lift about 50 tons to low Earth orbit, so this approach to solar radiation management would be challenging. Szapudi's approach brings the idea into the realm of possibility, even with today's technology, whereas prior concepts were completely unachievable. Also, developing a lightweight but strong graphene tether connecting the shield with the counterweight is crucial. More information: István Szapudi, Solar radiation management with a tethered sun shield, Proceedings of the National Academy of Sciences (2023). DOI: 10.1073/pnas.2307434120
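The mass budget reduces to simple bookkeeping, which the sketch below reproduces using only the round numbers quoted above; the launch count at the end is a naive extrapolation for illustration, not a figure from the paper:

```python
# Mass budget using only the round figures quoted in the article (tons).
total_mass = 3.5e6           # tethered shield plus counterweight
shield_fraction = 0.01       # only the shield is launched from Earth

shield_mass = total_mass * shield_fraction
print(f"shield launched from Earth: {shield_mass:,.0f} tons")  # 35,000

# The other 99% is asteroid material or lunar dust sourced in space.
counterweight_mass = total_mass - shield_mass                  # 3,465,000 t

# Naive launch count at today's ~50-ton-to-LEO capability.
print(f"launches for the shield alone: {shield_mass / 50:,.0f}")  # 700
```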
Space Technology
The Bluewalker 3 satellite now outshines nearly every star in the night sky. The object's brightness could impede the study of the skies, according to a group of astronomers. Astronomer Patrick Seitzer told Nature the night sky will be "irreversibly changed" if more satellites like it are launched. A new satellite now outshines nearly every star in the night sky. With more such objects due to be launched soon, the night sky could be irreversibly changed, scientists warn. This finding was part of a study published on Monday in the journal Nature. According to the report, a group of astronomers found that the newly launched Bluewalker 3 satellite now outshines every celestial body in the night sky other than seven stars, the Moon, Jupiter, and Venus. The study's authors — who observed Bluewalker 3 for 130 days — wrote that large constellations of bright and artificial satellites in low Earth orbit could impede the study of the skies. The Bluewalker 3 satellite was launched in September last year by telecommunications company AST SpaceMobile — in collaboration with Vodafone, AT&T, and Nokia — and enables 5G connectivity between phones without the use of a cell tower. Patrick Seitzer, an emeritus astronomer at the University of Michigan, Ann Arbor, told Nature that if more companies were to launch fleets of large satellites like AST SpaceMobile, "then the night sky will be irreversibly changed." That's because the Bluewalker 3 satellite can be hundreds of times brighter than the maximum recommended level by the International Astronomical Union, a professional body of astronomers. "These results demonstrate a continuing trend towards larger, brighter commercial satellites, which is of particular concern given the plans to launch many more in the coming years," wrote Siegfried Eggl, one of the study's co-authors and an assistant professor at the University of Illinois Urbana-Champaign, in a separate report. AST SpaceMobile told Nature that it was working with NASA and astronomy groups to address the concerns raised in the study. However, this isn't the first time scientists have sounded the alarm over satellites interfering with the study of space. In March, scientists wrote in a series of articles for Nature that large fleets of satellites like the ones utilized by Elon Musk's SpaceX should be regulated to safeguard our ability to study the skies. It's a long-running concern that dates back to 2019 when some scientists called attention to Starlink satellites having the potential to blot out the stars. "While other constellations may require thousands of satellites to achieve their coverage goals — there could be as many as 58,000 in orbit by 2030, according to a recent US government report — we plan to provide substantial global coverage with around 90 satellites," an AST SpaceMobile spokesperson told Insider on Tuesday. The study's authors did not immediately respond to Insider's requests for comment, sent outside regular business hours.
Space Technology
Simulating space to explore the great mystery of interstellar chemistry The universe is more than 13 billion years old and space is often depicted as a vast, empty vacuum. Other than planets and stars, there's nothing there, right? Actually, space is littered with complex, carbon-based molecules. However, the range of molecules and the chemistry involved in their formation remains largely mysterious. There have been tantalizing hints of complex astrochemistry. For example, prebiotic molecules like amino acids and nucleobases have been detected in meteorites—the most famous landed in 1969 near Murchison, around 140 km north of Melbourne. But to understand the molecular makeup of space, astronomers and astrochemists must go beyond analyzing the meteorites that happen to crash into Earth. To do this, astronomers measure stellar radiation using telescopes, while other scientists simulate interstellar conditions in the laboratory—more on how our team uses this technique later. Observing astronomical molecules Of the approximately 240 molecules now discovered in space, most have been revealed using radio telescopes. The James Webb Space Telescope (JWST)—the largest space telescope ever launched—is designed to image very distant objects and the emission from chemical species in the mid-infrared, which can be used to identify elements and molecules. The high sensitivity and resolution of the JWST have already made possible several recent milestone observations of astronomical molecules that help unravel the nature and origin of chemical complexity in the universe. These include recent evidence of complex organic molecules in the galaxy SPT0418-47, located 12.3 billion light-years away—the most distant and oldest organic molecules that have ever been detected. The molecules are polyaromatic hydrocarbons (PAHs) consisting of two or more fused aromatic rings of carbon, which are ubiquitous on earth, found in living systems, fossil fuels, and are implicated in the chemistry involved in the origin of life. Another key discovery is the observation of the methyl cation, methylium (CH3+), in the disk surrounding a newly formed star. That region is exposed to intense ultraviolet light from the hot young star, providing the energy necessary to form CH3+, which has the molecular structure equivalent to methane minus one hydrogen atom and plays an important role in the formation of more complex carbon-based molecules. These and other ingredients for complex chemistry are ejected by novae, supernovae and other massive cosmic events into the vast region between stars—the interstellar medium. Although we can't just travel lightyears away to collect and study interstellar molecules, astronomers can detect molecules in the interstellar medium by collecting the light emitted by stars. Since the light travels such an enormous distance before reaching the detectors on earth, it has an appreciable chance of being absorbed by molecules in interstellar clouds. The wavelengths of light that are absorbed give a spectrum that contains fingerprints of the molecules in space. Intriguingly, more than 500 absorption lines have been found in the visible and near-infrared region of the spectrum, which are known as the diffuse interstellar bands (DIBs). Several of the most prominent DIBs were first observed by Ph.D. student Mary Lea Heger in 1919 at the Lick Observatory in California (U.S.), with many more DIBs discovered since. What are interstellar molecules made of? 
While it is now clear that DIBs arise from molecules in the interstellar medium, the structure and composition of the molecules responsible for almost all DIBs remain a mystery. In fact, only one molecule has been identified as the source of any features in the DIBs spectra, the carbon cluster buckminsterfullerene (C60+), which has the appearance of a tiny soccer ball. Because C60+ is the only confirmed DIB carrier, it is reasonable to expect that other DIBs may be due to large carbon clusters consisting of between 10 and 100 atoms. Carbon clusters have a diverse range of sizes and molecular shapes; however, the wide range of possible structures complicates their detection. Simulating space In the basement of the Chemistry building in the Laser Spectroscopy Laboratory, Dr. Samuel Marlton, Ph.D. candidate Chang Liu, and Professor Evan Bieske employ a home-built apparatus to generate, separate and isolate individual carbon cluster structures under gas-phase conditions that resemble the cold vacuum of space. With this apparatus, the team compare the astrophysical data measured by astronomers with laboratory data for specific carbon cluster structures. In recent work published in the Journal of Physical Chemistry A, the research team examined the absorption spectra of Colossal Carbon Rings, carbon aggregates containing between 14 and 36 atoms arranged as planar rings. In that work, they found evidence for a carbon ring, C14+ (made from 14 carbons), that might contribute to the DIBs spectra. It is an exciting result because almost all of the more than 500 DIBs are due to molecules that remain unidentified, which illustrates the ongoing mysteries surrounding the molecular makeup of interstellar space. Although JWST is taking astronomical measurements at lower energy than reported here, both are part of the project to detect interstellar molecules through combined astronomical and laboratory investigations. Together, these techniques are just beginning to reveal what is out there in the universe, and it's so much more than empty space. More information: Samuel J.P. Marlton et al, Probing Colossal Carbon Rings, The Journal of Physical Chemistry A (2023). DOI: 10.1021/acs.jpca.2c07068
Space Technology
Researchers develop bioinspired geolocation method based on daytime sky polarization The first guy on Earth who ever got lost probably said to himself, "I could really use a set of geographic coordinates expressed as latitude and longitude right about now." Time passed, neocortexes evolved, and eventually, compasses and sextants gave way to global navigation satellite systems for geoposition and navigation. However, these systems are often unreliable and susceptible to jamming and spoofing. Magnetic compasses, while still useful, are subject to magnetic interference, and celestial navigation is only possible on a clear night without the interference of light pollution. A French research team developing an alternative geoposition method sought examples of geolocation methods from biology. While many species of birds and insects calibrate their magnetic compasses via the movements of stars around the celestial pole, they noted that some migrating bird species calibrate their internal compass during the day based on skylight polarization patterns. Rayleigh scattering theory describes how sunlight is polarized as it is scattered by small particles present in the Earth's atmosphere, and over the years, researchers have developed GPS-free navigational techniques exploiting this model. The researchers cited the specific example of the Cataglyphis desert ant. These ants forage in daylight for dead insects, hunting in a zigzag pattern. Each time an ant changes direction, it raises its head and takes a reading on the sun. When it finds a food source, it returns directly to the nest in a straight line to minimize its direct exposure to sunlight in the high heat of the desert, essentially computing its course via biological skylight polarization analysis. The research team now reports a system called SkyPole that uses a polarimetric camera to measure the degree of skylight polarization rotating with the sun. By processing captured images of the sky, the team was able to determine the position of the north celestial pole and pinpoint the observer's latitude and bearing. Their study is published in the Proceedings of the National Academy of Sciences. It is generally accepted that animals use skylight polarization to determine geolocation, but it is unclear how they use that information. The team took a cue from a theory that animal species like Cataglyphis use temporal properties of the skylight polarization pattern to position themselves, and developed a system that compares polarimetric images of the sky in order to find true north. The team compared images taken at two distinct moments with time intervals ranging from 30 to 60 minutes and computed the differences in two features: the degree of linear polarization and the angle of linear polarization. Their algorithm accounts for the constancy of the degree of linear polarization at the celestial north pole, as well as the two variables captured in the images. The method they describe has a number of advantages: Using only visual information, the system derives geolocation with reasonable accuracy, without reliance on time, date or initial position. Their image processing package is minimal and operates with modest computing resources. However, the image acquisition intervals and the system's degree of accuracy currently preclude certain geolocation applications. 
The researchers write, "It is worth noting, however, that this algorithm has been kept as simple as possible and that a more sophisticated data processing algorithm would no doubt greatly improve the accuracy … in future studies, special emphasis should be placed on image filtering in order to reduce the influence of noise." Additionally, they note that their study could contribute a new hypothesis regarding the use of visual information by animals for geolocation. More information: Thomas Kronland-Martinet et al, SkyPole—A method for locating the north celestial pole from skylight polarization patterns, Proceedings of the National Academy of Sciences (2023). DOI: 10.1073/pnas.2304847120
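At its core, the method as described is image differencing: the skylight polarization pattern rotates about the celestial pole as the sun moves, so the pixel whose degree of linear polarization changes least across exposures marks the pole. Below is a minimal numpy sketch of that idea, with a toy polarization pattern standing in for real calibrated sky images:

```python
import numpy as np

def sky_dolp(sun_y, sun_x, shape=(200, 200)):
    """Toy sky image: degree of linear polarization (DoLP) modeled as a
    smooth, monotonic function of distance from the sun's position."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    d = np.hypot(yy - sun_y, xx - sun_x)
    return np.sin(np.pi * d / 500.0)  # stays monotonic over this grid

def find_pole(frames):
    """The sun circles the celestial pole, so the pixel whose DoLP
    changes least across successive frames marks the pole."""
    change = sum(np.abs(b - a) for a, b in zip(frames, frames[1:]))
    return np.unravel_index(np.argmin(change), change.shape)

# Three exposures taken ~30 minutes apart: the sun moves along a circle
# of radius 50 pixels around a hidden pole at pixel (60, 140).
pole_y, pole_x = 60, 140
frames = [sky_dolp(pole_y + 50 * np.sin(a), pole_x + 50 * np.cos(a))
          for a in (0.0, 0.4, 0.9)]
print(find_pole(frames))  # (60, 140)
```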
Space Technology
A new machine-learning algorithm aims to provide improved measurements of stellar ages, allowing astronomers to better model how stars evolve. The algorithm is an AI version of a project called EAGLES, which stands for Estimating Ages from Lithium Equivalent Widths. EAGLES uses the lithium abundance of stars to determine their age. Previously, this work had been done by fitting data to graphs. With surveys producing more and more data, this task has grown time-consuming and complex, so an AI has been written to take on the job. All stars are born containing the same proportion of lithium, but as they age, they lose this lithium at different rates depending on their masses and therefore temperatures (since the more massive the star, the hotter the temperature, which astronomers use as a proxy because they can’t measure the mass of the star directly). The hotter a star, the greater the rate of convection in that star's outer layers and the more this churns up the lithium on a star’s surface. As lithium sinks into a star's interior, it's converted into two helium nuclei by fusing with a proton, and the result is that the lithium is increasingly depleted as time goes by. Therefore, the abundance of lithium observed in a star, coupled with the star’s temperature, should together provide a measure of that star’s age. Traditionally, astronomers measure a star's age with lithium by looking at the strength of the lithium spectral line in a star's spectrum (which is what 'equivalent widths' in EAGLES' name refers to), then trying to fit it to models of stellar evolution. Not only is this method "difficult to do and requires a lot of work," but scientists also want to expand beyond lithium abundances to include other stellar properties that can indicate age as well, said George Weaver of the United Kingdom's Keele University in an interview with Space.com. Weaver and his supervisor, Keele University astrophysicist Robin Jeffries, have thus introduced artificial intelligence to take on some of the workload, particularly when handling lots of information covering the other age indicators coming in from big, all-sky surveys. In these surveys, it's possible the AI can find new, previously undiscovered relationships in stars' data. Reconstructing star-formation history Astronomers can more easily measure the relative ages of stars in star clusters, because a cluster’s stars were all born at the same time, meaning they can be directly compared based on how they have evolved. With this in mind, Weaver and Jeffries sampled 6,000 stars, from a total of 52 clusters, observed by ESA's Gaia mission. Then, they trained the EAGLES algorithm on the selected stellar bodies. "A stellar evolution model tells you what a star should look like as a function of age," Jeffries told Space.com. "Clearly, if we have stars whose age we know, then that’s very helpful when comparing to stellar evolution models." AI is being used increasingly often in astronomy as a method of handling big data, and EAGLES is no different. It will soon be applied to two deep surveys, beginning with the WEAVE (WHT Enhanced Area Velocity Explorer) survey on the William Herschel Telescope at La Palma this year, and continuing in the 4MOST (4-meter Multi-Objects Spectrograph Telescope) survey on the VISTA (Visible and Infrared Survey Telescope for Astronomy) telescope at the European Southern Observatory in Chile in 2024. 
"These are two major spectroscopic surveys that will cover essentially the whole sky and take spectra of literally tens of millions of stars," said Jeffries. "They will produce lithium equivalent widths, temperatures, rotation rates and measurements of magnetic activity. We then hope to provide the age, which is an essential part of the whole raison d'etre for doing these surveys, which is to try and reconstruct the star-formation history of various populations of stars in the galaxy." Weaver hopes to eventually expand EAGLES even further by including age indicators besides lithium abundance. Some options include barium abundance, magnetic activity and stellar rotation rates. "You can put in as much or as little data as you have, and the model will do its magic," said Weaver. "So, at the minute, we're just expanding those additional age indicators." The EAGLES neural network does have one significant limitation, however. It can only accurately measure the ages of stars up to about 6 billion years old, which is the age of the oldest cluster the algorithm was trained on. Furthermore, the oldest stars in the universe tend to all have the same amount of lithium. "You wouldn’t get an age discriminator out of lithium for the oldest stars," said Jeffries. "It works best for young stars." This means EAGLES cannot be used to measure the age of controversial stars such as the Methuselah Star, which according to some studies, has an age older than the 13.8 billion years of the universe – although more recent studies have discredited such a notion and revised its age to less than the age of the universe. The research was presented at the National Astronomy Meeting in Cardiff at the beginning of July, and a paper published in Monthly Notices of the Royal Astronomical Society.
Space Technology
NASA’s Lunar Trailblazer is nearing completion now that its second and final cutting-edge science instrument has been added to the small spacecraft. Built by the University of Oxford in England and contributed by the UK Space Agency, the Lunar Thermal Mapper (LTM) joins the High-resolution Volatiles and Minerals Moon Mapper (HVM3), which was integrated with the spacecraft late last year. Together, the instruments will enable scientists to determine the abundance, location, and form of the Moon’s water. Led by Caltech in Pasadena, California, Lunar Trailblazer has a mass of about 440 pounds (200 kilograms) and measures only 11.5 feet (3.5 meters) wide with its solar panels fully deployed. The small satellite will rely on the LTM instrument to gather temperature data that will reveal the thermal properties of the lunar surface and the composition of silicate rocks and soils. The HVM3 imaging spectrometer, which was built by NASA’s Jet Propulsion Laboratory in Southern California, will detect and map the form, abundance, and locations of water in the same regions as the LTM instrument. “Lunar exploration is an international endeavor, and Lunar Trailblazer embodies that spirit with the University of Oxford’s and UK Space Agency’s contribution to the mission,” said Bethany Ehlmann, the mission’s principal investigator at Caltech. “With the combined power of both of these sophisticated instruments, we can better understand where and why water is on the Moon and support the next era of Moon exploration.” Launching before the Artemis program’s human landings, Lunar Trailblazer will return information about the Moon’s water, providing maps to guide future robotic and human explorers. Lunar water could be used in a variety of ways, from purifying it as drinking water to processing it for fuel and breathable oxygen. “The Lunar Trailblazer mission will improve our understanding of our natural satellite and how we could harness its resources to support exploration in the future,” said Libby Jackson, Head of Space Exploration at the UK Space Agency. “Backing missions and capabilities that will drive opportunities for humanity to venture deeper into space is one of our priorities, so it’s exciting to see the LTM instrument ready for launch.” Lunar Trailblazer was selected by NASA’s SIMPLEx (Small Innovative Missions for Planetary Exploration) program in 2019, and the spacecraft will launch as a secondary payload on the second Intuitive Machines robotic lunar lander mission, called IM-2. That launch, which will also carry NASA’s Polar Resources Ice Mining Experiment-1 subsurface ice drill, is expected no earlier than early 2024. Lunar Water Cycle When Lunar Trailblazer arrives in orbit around the Moon, it will use HVM3 to map the spectral fingerprints – or wavelengths of reflected sunlight – of the different forms of water over the lunar landscape. LTM will scan those mapped regions at the same time to form an image that can be used to characterize the temperature of the surface. By measuring the same locations at different times of day, Lunar Trailblazer will determine if the amount of water changes on this airless body. It is thought that some water molecules might be locked inside lunar rock and regolith (broken rock and dust), particularly those containing silicates, which are the most abundant mineral on the Moon. Other water molecules may move and settle for short periods as frost in cold shadows. As the Sun changes position in the sky during the lunar day, the shadows move. 
This causes the ice to sublimate, transforming into vapor without passing through a liquid phase. As the water molecules move in the Moon’s extremely thin atmosphere to other cold places, they can settle once more as a frost. The most likely locations to hold water ice in significant quantities are the always-cold permanently shadowed craters at the lunar poles, which are key targets for science and exploration. “LTM precisely maps the surface temperature of the Moon while the HVM3 instrument looks for the spectral signature of water molecules,” said Neil Bowles, instrument scientist for LTM at the University of Oxford. “Combining the measurements from both instruments allows us to understand how surface temperature affects water, improving our knowledge of the presence and distribution of these molecules on the Moon.” LTM will provide maps of lunar surface temperature from about minus 265 to 266 degrees Fahrenheit (minus 165 to 130 degrees Celsius) using four broadband infrared channels. The instrument will scan the lunar surface to form a multispectral image as the spacecraft orbits above. At the same time, 11 narrow infrared channels also map small variations in the composition of silicate minerals that make up the rocks and regolith of the Moon’s surface, providing more information about what the lunar surface is made of and how this may influence the amount of water present. Lunar Trailblazer is undergoing final assembly and testing at Lockheed Martin Space in Littleton, Colorado, and the spacecraft recently completed thermal vacuum chamber testing that simulates the harsh environment of space. Now, with both instruments integrated with the spacecraft and undergoing final system-level testing, Lunar Trailblazer is approaching readiness to ship to Florida for final launch preparations. More About the Mission Lunar Trailblazer is managed by JPL and its science investigation and mission operations are led by Caltech. Managed for NASA by Caltech, JPL also provides system engineering, mission assurance, the HVM3 instrument, as well as mission design and navigation. Lockheed Martin Space provides the spacecraft and integrates the flight system, under contract with Caltech. SIMPLEx mission investigations are managed by the Planetary Missions Program Office at NASA’s Marshall Space Flight Center in Huntsville, Alabama, as part of the Discovery Program at NASA Headquarters in Washington. The program conducts space science investigations in the Planetary Science Division of NASA’s Science Mission Directorate at NASA Headquarters.
Space Technology
A prototype satellite has become one of the brightest objects in the night sky, and it may soon be accompanied by dozens more. A study tracking the BlueWalker 3 satellite, launched in September 2022 by AST SpaceMobile, found that it is at times brighter than all but a handful of stars and planets that can be seen from Earth. The findings, published in the journal Nature, highlight a fast-escalating concern among astronomers, who have warned that the influx of private space ventures in low-Earth orbit could alter our view of the night sky and interfere with research. Researchers with the International Astronomical Union’s Center for the Protection of the Dark and Quiet Sky from Satellite Constellation Interference (CPS) observed BlueWalker 3 over the course of 130 days. BlueWalker 3’s antenna array measures just shy of 700 square feet, making it the largest yet for a commercial satellite in low-Earth orbit. That huge array reflects sunlight, and after it unfurled, its brightness spiked. The effect isn’t constant, but instead fluctuates depending on factors like the satellite’s position relative to the sun, and the viewing angle. The CPS team observed it from sites in Chile, the US, Mexico, New Zealand, the Netherlands, and Morocco. “These results demonstrate a continuing trend towards larger, brighter commercial satellites, which is of particular concern given the plans to launch many more in the coming years,” said Siegfried Eggl, one of the co-authors of the study. “While these satellites can play a role in improving communications, it is imperative that their disruptions of scientific observations are minimized.” AST SpaceMobile eventually plans to deploy a fleet of roughly 100 cellular broadband satellites based on the BlueWalker 3 design. SpaceX, whose thousands of Starlink satellites have repeatedly come under scrutiny for their potential impact on the night sky, has experimented with dark coatings to cut down on the amount of reflected light, to limited success. For astronomers, to whom it poses a growing headache, it's not enough. Stations observing from the ground will need to develop satellite avoidance strategies to work around these artificial constellations, the researchers note in the paper. And, visibility isn’t the only problem. Commercial satellites, including BlueWalker 3, flooding low-Earth orbit also threaten to interfere with radio astronomy. A separate study published earlier this year found Starlink satellites are leaking “unintended electromagnetic radiation” that could disrupt radio telescope observations.
Space Technology
Last week, in a spinning tank beneath the University of Arizona's football stadium, an oven kicked to life. The oven, at the University of Arizona's Richard F. Caris Mirror Lab, began to heat a 20-ton, 27.6-foot-wide (8.4 meters) pool of optical glass to 2,130 degrees Fahrenheit (1,165 degrees Celsius), in the first steps of manufacturing a telescope mirror. The oven's present pastry is the seventh and final mirror of the Giant Magellan Telescope, itself under construction in the mountains of northern Chile. The telescope's crown jewel will be a seven-segment mirror. When all seven pieces are in place, they will work together as a single light-collecting surface 80 feet (24.5 m) across. Each of those mirrors must be of the highest quality, and that takes time. This last mirror will take four months to cool. After that, technicians will begin grinding and polishing its surface to an astronomically precise finish — perfect to within one one-thousandth the width of a human hair. The entire process, from baking to completion, will take four years. Afterward, the mirror segment will journey down to Chile by boat to join its six counterparts. One of those six is currently serving as a guinea pig to test a prototype of the telescope's eventual support structure. Astronomers expect to open the Giant Magellan Telescope's supersharp eye on the universe by the end of this decade. "The combination of light-gathering power, efficiency, and image resolution will enable us to make new discoveries across all fields of astronomy," Rebecca Bernstein, the telescope's chief scientist, said in a statement. "We will have a unique combination of capabilities for studying planets at high spatial and spectral resolution, both of which are key to determining if a planet has a rocky composition like our Earth, if it contains liquid water and if its atmosphere contains the right combination of molecules to indicate the presence of life," she added.
Space Technology
Perseid meteor shower: All you need to know in 2023 The Perseid meteor shower always peaks around August 11, 12 and 13. In 2023, the Perseids peak right before a new moon. Early to mid-August meteors … the Perseids Predicted peak: August 13, 2023, at 7:58 UTC.** When to watch: The moon will be a waning crescent and 10% illuminated during 2023’s peak of the Perseid meteor shower. This shower rises to a peak gradually, then falls off rapidly. And Perseid meteors tend to strengthen in number as late night deepens into the wee hours of the morning. The shower is often best before dawn. In 2023, the moon will be in the morning sky from early to mid-August but growing fainter each day before the peak. Duration of shower: July 14 to September 1. Radiant: The radiant rises in the middle of the night and is highest at dawn. Nearest moon phase: New moon falls at 9:38 UTC on August 16. There will be a waning crescent moon up during the Perseids’ peak in 2023. But the thin crescent moon will not be too bright. And you might even enjoy the waning crescent moon as you watch for the Perseids (and the Delta Aquariids) in 2023. Expected meteors at peak, under ideal conditions: Under a dark sky with no moon, skywatchers frequently report 90 meteors per hour, or more. In 2023, the waning crescent moon will not interfere with the meteor shower. Note: The August Perseid meteor shower is rich and steady, from early August through the peak. Plus, the Perseid meteors are colorful. And they frequently leave persistent trains. All of these factors make the Perseid shower perhaps the most beloved meteor shower for the Northern Hemisphere. Perseid meteor shower radiant point Around the peak mornings, if you trace all the Perseid meteors backward, they seem to come from the constellation Perseus, near the famous Double Cluster. Hence, the meteor shower is named in honor of the constellation Perseus the Hero. Of course, there’s no real connection between the meteor shower radiant and the constellation Perseus. The stars in Perseus are many light-years distant, while these meteors burn up about 60 miles (100 km) above the Earth’s surface. The Perseids’ parent comet The parent comet responsible for the Perseid meteor shower is a rather large comet called 109P/Swift-Tuttle. The comet orbits the sun approximately every 133 years. Lewis Swift of Marathon, New York, visually discovered it on July 16, 1862, using an 11-centimeter (4.3-inch) refractor telescope. He did not report it immediately, believing that he was observing Comet Schmidt, which was found two weeks prior. Then, three days later, Horace Tuttle picked it up from Harvard Observatory. Scientists calculated that the comet would return in 120 years. That is, we would see it again in 1982. So, 1982 came and went. And the comet didn’t show up. Oops! It was back to the drawing board, and this time, the appearance of a comet observed in 1737 was considered a possible early appearance of the comet. Now, the orbital period was more like 130 years. Brian Marsden published new orbital elements and an ephemeris as to where to find it, for its 1992 return. In the 1980s, many of us visual comet hunters would, from time to time, cover the part of the sky where the incoming comet was supposed to appear. The 1991 outburst of Perseid meteors indicated that the comet was probably on its way back. Another meteor outburst in 1992 seemed to confirm that. 
On September 26, 1992, Tsuruhiko Kiuchi, an amateur astronomer and comet hunter, picked up the comet in the evening sky, just north of the bowl of the Big Dipper. Knowing where to look, I observed this comet 16 hours later and made a brightness estimate five times brighter than the original report. Others then confirmed this. Later, Gary Kronk suggested that the comets observed in 69 CE and 188 BCE were also appearances of this comet, a theory later confirmed. Do Perseid meteors ever hit the ground? Meteors that hit the ground intact are called meteorites. But few – if any – meteors in annual showers become meteorites. That’s primarily because of the flimsy nature of cometary debris. Comets are made of ices. Most meteorites, on the other hand, are the remains of rocky or metallic asteroids. In ancient Greek star lore, Perseus is the son of the god Zeus and the mortal Danaë. It was said that the Perseid shower commemorates the time when Zeus visited Danaë, the mother of Perseus, in a shower of gold. So think of the ephemeral nature of meteors in meteor showers, as you stand outside watching for Perseids in 2023. Most meteors strike Earth’s atmosphere unseen. You can consider any Perseid meteor you do see in 2023 as there for your viewing pleasure! Bottom line: The 2023 Perseid meteor shower should produce the most meteors in the predawn hours of August 11, 12 and 13. And a thin waning crescent moon shouldn’t interfere with seeing the meteors. Here’s how to get the most from this year’s shower. **Predicted peak times and dates for 2023 meteor showers are from the American Meteor Society. Note that meteor shower peak times can vary.
Space Technology
Nearly three years after the rapid unplanned disassembly of the Arecibo radio telescope, we finally have a culprit in the collapse: bad sockets. In case you somehow missed it, back in 2020 we started getting ominous reports that the cables supporting the 900-ton instrument platform above the 300-meter primary reflector of what was at the time the world’s largest radio telescope were slowly coming undone. From the first sign of problems in August, when the first broken cable smashed a hole in the reflector, to the failure of a second cable in November, it surely seemed like Arecibo’s days were numbered, and that it would fall victim to all the other bad luck we seemed to be rapidly accruing in that fateful year. The inevitable finally happened on December 1, when over-stressed cables on support tower four finally gave way, sending the platform on a graceful swing into the side of the natural depression that cradled the reflector, damaging the telescope beyond all hope of repair. The long run-up to the telescope’s final act had a silver lining in that it provided engineers and scientists with a chance to carefully observe the failure in real-time. So there was no real mystery as to what happened, at least from a big-picture perspective. But one always wants to know the fine-scale details of such failures, a task which fell to forensic investigation firm Thornton Tomasetti. They enlisted the help of the Columbia University Strength of Materials lab, which sent pieces of the failed cable to the Oak Ridge National Laboratory’s High Flux Isotope reactor for neutron imaging, which is like an X-ray study but uses streams of neutrons that interact with the material’s nuclei rather than their electrons. The full report (PDF) reveals five proximate causes for the collapse, chief of which is “[T]he manual and inconsistent splay of the wires during cable socketing,” which we take to mean that the individual strands of the cables were not spread out correctly before the molten zinc “spelter socket” was molded around them. The resulting shear stress caused the zinc to slowly flow around the cable strands, letting them slip out of the surrounding steel socket and — well, you can watch the rest below for yourself. As is usually the case with such failures, there are multiple causes, all of which are covered in the 300+ page report. But being able to pin the bulk of the failure on a single, easily understood — and easily addressed — defect is comforting, in a way. It’s cold comfort to astronomers and Arecibo staff, perhaps, but at least it’s a lesson that might prevent future failures of cable-supported structures. [via New Atlas]
Space Technology
Collisions between black holes can launch newly melded cosmic sinkholes at speeds of up to nearly one-tenth the speed of light, researchers report August 18 in Physical Review Letters. Moving that fast — about 28,500 kilometers per second — it would take about 13 seconds to complete the average trip from Earth to the moon. The findings may help researchers figure out how much energy can get released when black holes converge.

Black holes merge after coming so close together that they become ensnared in each other's gravitational pull. The couple enters a cosmic waltz, spiraling ever tighter and stirring up ripples in spacetime known as gravitational waves (SN: 6/28/23; SN: 1/21/21). If the pair's convergence emits gravitational waves preferentially in one direction, then the emerging black hole will recoil in the opposite direction at high speed, like kickback from a gun (SN: 4/25/22). Previous research had indicated the blowback from such twisting mergers could hurl black holes at up to about 5,000 kilometers per second.

For the new study, astrophysicists Carlos Lousto and James Healy, of the Rochester Institute of Technology in New York, probed black hole interactions that were more head-on, akin to particles zipping toward each other in a particle accelerator. The duo simulated more than 1,300 approaches, including close flybys, direct collisions and everything in between. The mashups that almost didn't happen — grazing collisions that still resulted in merging — produced the fastest recoiling black holes, the researchers found. Such collisions might be possible with three or more black holes, Lousto says. "Two black holes orbiting in opposite directions around a larger black hole might eventually collide," he explains, dispatching their descendant with the quickest of quick kicks.
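A quick back-of-the-envelope check of those figures, using only the reported 28,500 km/s kick speed and the mean Earth-moon distance of roughly 384,400 km:

```python
# Sanity check on the recoil numbers reported above.
C_KM_S = 299_792.458      # speed of light, km/s
EARTH_MOON_KM = 384_400   # mean Earth-moon distance, km
recoil_km_s = 28_500      # reported maximum recoil speed, km/s

print(f"fraction of light speed: {recoil_km_s / C_KM_S:.3f}")            # ~0.095
print(f"Earth-to-moon travel time: {EARTH_MOON_KM / recoil_km_s:.1f} s")  # ~13.5 s
```

Both outputs line up with the article's "nearly one-tenth the speed of light" and "about 13 seconds."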
Space Technology
On Sept. 12, a mirror-walled box arrived in the clean room of Planet Labs in San Francisco. The box contains a spectrometer designed specifically to observe carbon dioxide and methane at Earth's surface. Forged at NASA's Jet Propulsion Laboratory (JPL) farther south in California, the spectrometer has stopped in San Francisco to be mounted onto a satellite called Tanager. That satellite, if all goes according to plan, should launch in 2024. The nonprofit Carbon Mapper hopes to use Tanager to pinpoint greenhouse gas "super-emitters" on our planet, and the newly arrived spectrometer is a key component of that mission.

The device is designed to observe infrared light reflected from Earth's surface, then separate that light into its spectrum. Different gases in Earth's atmosphere each absorb different wavelengths of light, leaving characteristic gaps in the spectrum and allowing observers to reconstruct which gases were present at a given location.

Before sending the spectrometer north, JPL staff tested the instrument's ability to perform this task. Inside a vacuum chamber, scientists placed a methane sample in clear view of the spectrometer. And, by JPL's account, the spectrometer succeeded. "We are thrilled to see the exceptional quality of the methane spectral signature recorded," Robert Green, an instrument scientist at JPL, said in a statement. "This bodes well for the space measurement soon to follow."

Carbon Mapper — a collaboration between JPL, Planet, the California Air Resources Board, Rocky Mountain Institute, Arizona State University, and the University of Arizona — already launched EMIT, an instrument aboard the International Space Station (ISS) that monitors mineral dust blown from Earth's deserts. Eventually, this spectrometer will join it in space, observing from an orbit that wraps around Earth's poles.
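The gap-in-the-spectrum idea is simple enough to demonstrate. Below is a minimal sketch, with toy Gaussian absorption dips placed near wavelengths where methane and CO2 do absorb in the shortwave infrared; the line centers, depths and widths are illustrative assumptions, not Tanager's actual band definitions or retrieval method:

```python
import numpy as np

# Illustrative line centers in micrometers; real retrievals use detailed
# spectroscopic databases, not these toy values.
LINES = {"CH4": 2.30, "CO2": 2.01}

wavelengths = np.linspace(1.9, 2.5, 2000)   # micrometers
continuum = np.ones_like(wavelengths)       # ideal reflected spectrum, no gases

def absorb(spectrum, center, depth=0.4, width=0.01):
    """Carve a Gaussian absorption dip into the spectrum at `center`."""
    return spectrum * (1 - depth * np.exp(-((wavelengths - center) / width) ** 2))

observed = absorb(absorb(continuum, LINES["CH4"]), LINES["CO2"])

# Identify each gas by checking for a dip at its characteristic wavelength.
for gas, center in LINES.items():
    idx = np.abs(wavelengths - center).argmin()
    if observed[idx] < 0.9:
        print(f"{gas}: absorption detected (transmission {observed[idx]:.2f})")
```

In essence, the instrument measures `observed`, and the analysis works backward from the dips to the gases that caused them.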
Space Technology
Beyond Bennu: How OSIRIS-REx is helping scientists study the sonic signature of meteoroids

In the high desert of Nevada, Elizabeth Silber watched NASA's Sample Return Capsule from OSIRIS-REx descend into Earth's atmosphere on Sunday, but unlike most scientists, she wasn't there for the asteroid rocks. Silber, a physicist at Sandia National Laboratories, is working with researchers from Sandia and Los Alamos national laboratories, the Defense Threat Reduction Agency, TDA Research Inc., the Jet Propulsion Laboratory, the University of Hawaii and the University of Oklahoma in a campaign to record and characterize the infrasound and seismic waves generated by the capsule as it moved through Earth's atmosphere at hypersonic speed, about 26,000 miles per hour.

This was the largest observational campaign of any hypersonic event in history, and Silber hopes the data will improve scientists' ability to use infrasound to detect meteoroids and other objects moving at hypersonic speeds. Scientists currently use infrasound, a low-frequency sound wave that is generally inaudible to humans, to detect and observe volcanic activity, earthquakes and explosions. Silber said infrasound can also be observed when meteoroids enter Earth's atmosphere, but atmospheric conditions like wind can distort the signal, and there's usually relatively little information available about the incoming meteoroid to help with data analysis.

"The OSIRIS-REx capsule is the perfect candidate for studying a hypersonic event because we know everything about it—the entry angle, velocity, spin rate, size and mass—and we can use that information to calibrate our models and test our sensors' abilities," Silber said. "Because the capsule was traveling faster than the speed of sound, it generated a shockwave. As the shockwave propagated away from the capsule, it turned into infrasound waves that could be detected."

The multi-agency team launched four solar balloons and two weather balloons equipped with microbarometers in Nevada and had ground-based sensors in multiple locations. "These are tiny little sensors that measure minute changes in air pressure," Silber said. "Infrasound is a pressure wave, and when we group sensors together in an array, we can determine the direction the infrasound is traveling from." Silber said the group had an unprecedented number of sensors recording data, including 45 single sensors, one large rectangular array with 200 sensors, and three arrays composed of four sensors in a triangle formation.

At first, the team checked to see how many sensors detected the signal. Back home in the lab, Silber and her colleagues will conduct a more extensive study. "We want to determine where along the trajectory of the capsule the shockwave came from," Silber said. "The wave will be a continuous thing along the trajectory, so the question will be where exactly did that signal originate from? From a certain altitude? From different parts of the trail?" The team plans to compare signals recorded from different locations in Nevada and Utah to see if they point to the same origination spot along the capsule's trail. Because the capsule's speed changed as it plunged toward the surface, moving from hypersonic to supersonic to transonic, the team will also be able to study all stages of flight. "Moreover, we will study how strong acoustic waves propagate, test how well our instruments can capture signals and study the effects of the atmosphere on infrasound waves," Silber said.
"All this will enhance our knowledge and ability to use infrasound to detect meteoroids and artificial objects with infrasound." In preparation for this campaign, Silber, Daniel Bowman and Sarah Albert published a paper in Atmosphere reviewing past infrasound and seismic observation studies from the four other sample return missions that have occurred since the end of NASA's Apollo missions and summarizes their utility in characterizing the flight of meteoroids through Earth's atmosphere. Silber is also leading a separate but similar Laboratory Directed Research and Development project to investigate if infrasound can be used to determine the altitude and speed of bolides—bright, exploding meteoroids—in situations when other types of sensors don't provide adequate data. More information: Elizabeth A. Silber et al, A Review of Infrasound and Seismic Observations of Sample Return Capsules Since the End of the Apollo Era in Anticipation of the OSIRIS-REx Arrival, Atmosphere (2023). DOI: 10.3390/atmos14101473 Provided by Sandia National Laboratories
Space Technology
Two consecutive powerful solar flares have affected terrestrial communications and provided another fresh signal that the peak of our star's activity is approaching. The U.S. National Oceanic and Atmospheric Administration's (NOAA) Space Weather Prediction Center (SWPC) explained that these flares "likely caused the degradation or total loss of communications" in the North American region, which at the time was facing the Sun.

Overnight from Saturday into Sunday, NASA detected a "strong" flare that was classified as type X, the most severe, and imaged it with its Solar Dynamics Observatory. This flare was compounded when two powerful coronal mass ejections merged as they detached from the Sun, as reported by SpaceWeather, resulting in an even more powerful phenomenon. A new model from the SWPC shows that the second, faster ejection outpaced and cannibalized the first, potentially turning the sum of the two into a geomagnetic storm that was forecast to reach Earth on August 8.

On Monday, August 7, NASA captured another strong solar flare. "The image shows a subset of extreme ultraviolet light that highlights the extremely hot material in flares," the U.S. space agency said in a statement. Meanwhile, the NOAA detailed that this flare affected the Earth in an event that "peaked at 20:46 UTC (4:46 pm EDT) on 7 Aug and likely caused HF radio communication degradation or complete loss on the sunlit side of Earth during the flare's duration."

The Sun emitted a strong solar flare on Aug. 7, 2023, peaking at 4:46 pm EDT. NASA's Solar Dynamics Observatory captured an image of the event, which was classified as X1.5. https://t.co/MEYYmrSDkI pic.twitter.com/0WCotB48eR — NASA Sun & Space (@NASASun) August 7, 2023

North America and the Pacific were affected by the blackout, which began around 20:37 UTC over the west coast of Mexico and ended at approximately 21:51 UTC over the east coast of Hawaii. According to the SWPC, the event was graded R3, a "strong blackout" on a scale from R1 (minor) to R5 (extreme).

This second flare is also of type X, and of similar strength to the previous one: X1.5 and X1.6, respectively. The categories depend on the amount of energy released: there are five, with X being the maximum. NASA explains that the additional number provides more information about the flare's strength. In both cases, the flare emerged from the same sunspot, cataloged as region 3386. Although sunspots are not the same as flares, there is a relationship between the two solar phenomena. More sunspots mean "more activity and a higher probability of flares," Consuelo Cid Tortuero, a senior scientist at the Spanish National Service of Space Meteorology, recently explained.

Correction on the month under timing in this graphic. The flare peaked as an R3 event on 7 Aug at 4:46pm EDT. The flare was long duration and officially ended at 5:18pm EDT, but remained above R1 levels until 6:44pm EDT on 7 Aug. pic.twitter.com/4WG0OMsMiu — NOAA Space Weather (@NWSSWPC) August 8, 2023

Solar flares are powerful bursts of energy, NASA explains, and can affect radio communications, power grids and navigation signals, and pose risks to spacecraft and astronauts. The phase known as solar maximum was forecast to arrive in 2025, but these recent phenomena point to the maximum of the current cycle, Solar Cycle 25, coming in late 2023 or early 2024, following what specialists call a "termination event."
The phenomenon occurs when an 11-year solar cycle ends abruptly, reversing the star's polarity, and a new cycle begins with greater intensity, producing geomagnetic storms that can cause blackouts on Earth but also spectacular aurora borealis at unexpected latitudes. In June, the NOAA explained that the solar cycle had accelerated more quickly than scientists had predicted, producing more sunspots and flares than experts had forecast. As such, these solar events will continue to increase as our star approaches solar maximum. "Though we are seeing increased activity on the Sun, we expect this solar cycle to be average compared to solar cycles in the past century," the NOAA said in a statement.
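For reference, the letter-plus-number flare classes used above map directly onto peak soft X-ray flux as measured by GOES satellites, with each letter step a factor of ten. A minimal sketch of the conversion:

```python
# GOES soft X-ray (0.1-0.8 nm) peak-flux baselines per flare class, in W/m^2.
CLASS_FLUX = {"A": 1e-8, "B": 1e-7, "C": 1e-6, "M": 1e-5, "X": 1e-4}

def flare_flux(classification: str) -> float:
    """Convert a flare class like 'X1.5' to peak X-ray flux in W/m^2."""
    letter, multiplier = classification[0].upper(), float(classification[1:])
    return CLASS_FLUX[letter] * multiplier

for flare in ("X1.5", "X1.6", "M5.0", "C2.3"):
    print(flare, f"{flare_flux(flare):.1e} W/m^2")
# An X flare is ten times as powerful as an M flare with the same number.
```

So the two flares described here, X1.5 and X1.6, peaked at about 1.5 and 1.6 times 10^-4 watts per square meter, respectively.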
Space Technology
AI is now taking a larger step into the world of astronomy. News recently broke of a new supernova discovered by AI. Astronomers at Northwestern University led the collaboration and developed the world's first AI-powered, fully automatic supernova detection system (via Gizmodo). The system could help streamline future studies of exploding stars.

Before the creation of this tool, scientists had to rely on a combination of automated systems and human verification. But with the assistance of AI, they're finally able to let the machines do the heavy lifting: not only detecting a candidate but also determining whether or not it actually is a supernova. The AI-powered supernova detection system is called the Bright Transient Survey Bot, or BTSbot, and if it continues to be successful, it could cut the human middleman out of the process completely, letting astronomers focus their attention on other things.

To help it learn, researchers fed BTSbot over 1.4 million images from 16,000 different astronomical sources. This allowed the machine learning algorithm that powers the system to learn how to recognize supernovae and detect them across the universe. Equipped with that training, the AI-powered detection system was put to work, and it eventually identified a supernova candidate. This stellar explosion is believed to have come from a white dwarf star that fully exploded. The AI detected the supernova and automatically shared its findings with the astronomical community, removing the human part of the equation and streamlining the process.

It's been a huge success so far, and it could help us further identify supernovae going forward. But why remove humans from the loop? Adam Miller, an assistant professor of physics and astronomy at Northwestern, says that "removing humans from the loop provides more time for the research team to analyze their observations and develop new hypotheses to explain the origin of the cosmic explosions that we observe."
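The article doesn't describe BTSbot's architecture, but for readers curious what an image-based "real transient vs. bogus" classifier can look like, here is a purely illustrative sketch. Every choice below, including the layer sizes, the 63x63-pixel cutout shape and the training call, is an assumption for illustration, not the Northwestern team's actual design:

```python
# Purely illustrative: a small CNN for binary "supernova candidate vs. bogus"
# classification of survey image cutouts. Not BTSbot's real architecture.
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    # Hypothetical input: stacked science/reference/difference image cutouts.
    layers.Conv2D(32, 3, activation="relu", input_shape=(63, 63, 3)),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # outputs P(real transient)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(cutouts, labels, ...) would then train on labeled survey images,
# analogous to the 1.4 million labeled images described above.
```

The key point is the workflow, not the layers: once such a model scores candidates reliably, follow-up requests and community alerts can be triggered automatically, which is exactly the human step BTSbot removes.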
Space Technology
The moon passed through part of Earth's shadow in a partial lunar eclipse visible to potentially millions of stargazers across the Eastern Hemisphere on Saturday, offering an early skywatching treat days before Halloween.

The partial lunar eclipse of Oct. 28, the last of four eclipses of 2023 — two each of the moon and sun — occurred during October's Full Hunter's Moon, offering the spooky sight of part of the moon disappearing as it was engulfed in the darkness of Earth's shadow. The eclipse was visible only from the night side of Earth as our planet moved between the moon and sun. Skywatchers with clear skies could see the event from countries across Europe, Asia, Africa and parts of Australia. Some observers in select U.S. states, like New York, Alaska and North Carolina, were also able to catch the end stages of the eclipse. For everyone else, several livestream webcasts of the lunar eclipse showed online views from TimeandDate.com as well as from Ceccano, Italy, by the Virtual Telescope Project.

TimeandDate.com captured stunning video of the entire lunar eclipse, with telescopes spread across three continents in regions like Bergen, Norway; Dubai, United Arab Emirates; and Perth, Australia. Near the end of the eclipse, the telescope from Norway captured a truly spectacular sight: the fading lunar eclipse with the brilliant planet Jupiter to the upper right of the moon.

In Dubai, nearly 200 spectators gathered at the Al Thuraya Astronomy center in Mushrif Park to watch the lunar eclipse with the Dubai Astronomy Group, which webcast its views in TimeandDate.com's livestream. "Lots of children came to see this event. We are very excited and nobody is on their phone, which is incredible, just everyone just looking up at the moon," Khadijah Ahmad, operations manager of the Dubai Astronomy Group, said during the livestream. "We have about eight telescopes set up downstairs and the public are all over these telescopes observing and taking pictures."

Cloudy weather in London, England, wasn't enough to ruin the view for one skywatcher. "My clouds parted just in time for the climax at 9:14 pm in London," wrote one observer, who goes by Epiphany and @FunkyAppleTree on X, formerly known as Twitter, while sharing stunning photos. "Thrilled."

Thank you, @metoffice My clouds parted just in time for the climax at 9:14pm in London. Thrilled. @StormHour @skyatnightmag #PartialLunarEclipse #LunarEclipse pic.twitter.com/9msYmocI5w — October 28, 2023

Another observer in Delhi, India, was amazed as well. "Omg. 1st time in [my] life I tried & luckily watched [a] very clear lunar eclipse in Delhi," wrote Shweta @imshwetta on X. "Partial moon covered in black."

Omg 1st time in life I tried & luckily watched very clear #LunarEclipse in #Delhi Partial moon covered in black #ChandraGrahan #Moon #LunarEclipse2023 pic.twitter.com/TQowehIBbA — October 28, 2023

Here are some more amazing views from eclipse watchers on X who tracked the lunar eclipse from around the world.

LOOK: The maximum of the October 29 Partial #LunarEclipse taken from Cebu City, Philippines at 4:14 AM PhST. Thank you so much, Alma Alfafara for sharing! Sony a7ii sigma 100-400mm 600mm aps-c mode pic.twitter.com/X0Wdcvalv4 — October 28, 2023

In Italy, photojournalist Lorenzo Di Cola of NurPhoto and Getty Images captured a view of the lunar eclipse from L'Aquila, Italy, showing the Earth's shadow on the moon from a different vantage point.

Captured the mesmerizing Lunar Eclipse over Mumbai tonight with my mobile!
🌕✨ #LunarEclipse #MumbaiSky #NightPhotography #AstronomyLove pic.twitter.com/AG1rwssnNj — October 28, 2023

Shock horror! I think the Earth might be round, take that flat-earthers!! 🤭 Lunar eclipse post-maximum from the back of my camera! You can clearly see Earth's CURVED shadow cast onto the #Moon!! #LunarEclipse #MoonHour @MoonHourSocial @DavidBflower pic.twitter.com/SDxEP16kTF — October 28, 2023

Saturday's partial lunar eclipse began at 2:01 p.m. EDT (1801 GMT) and was expected to last about 4.5 hours, ending at 6:26 p.m. EDT (2226 GMT). It was a partial eclipse because at the time of the event, the moon only partially moved into the darkest part of Earth's shadow — called the umbra.

This was the last lunar eclipse of 2023. The next one will occur on March 24, 2024, but will be less impressive, with the moon passing only through the Earth's outer shadow, which scientists call the penumbra. That eclipse will be visible from North America and is a preview for a truly spectacular total solar eclipse on April 8, 2024, which will be visible from Mexico, the United States and Canada.

Editor's Note: If you snap an image of the Hunter's Moon and would like to share it with Space.com's readers, send your photo(s), comments, and your name and location to [email protected].
Space Technology
The blazing surface of the sun froths with an extremely hot electrically charged gas called plasma. The temperature at the edge of this cosmic furnace runs at about 5,500 degrees Celsius, but here's the real puzzle: Somehow the sun's atmosphere, which surrounds that surface like a halo, is 150 times hotter.

"Why is the corona 1 million degrees while the photosphere is at 5,500?" asks Yannis Zouganelis, deputy project scientist for the European Space Agency's Solar Orbiter probe. "The main problem is, we have many ideas, many theories, but we have no real measurements."

Until now. Last year, the Solar Orbiter swooped in for a close-up. It examined the corona from a distance of 140 million kilometers—close enough to get good readings, but far enough away to not melt or damage its cameras. Even more crucially, thanks to some astronomical choreography, engineers coordinated that maneuver with a flyby from NASA's Parker Solar Probe to make the first joint measurements of the corona. Together, they pulled off observations neither probe could have made on its own, Zouganelis says. Their findings have just appeared in a new study in the Astrophysical Journal Letters.

The ESA's Solar Orbiter carries a coronagraph, an instrument called Metis, developed by scientists at the Italian National Institute for Astrophysics. It blocks out light from the sun's surface, allowing the probe to take photos only of the corona. Imaging the corona in detail at optical and ultraviolet wavelengths allows scientists to study the dynamics within that solar atmosphere and better understand the heating rate within it.

NASA's probe maneuvered much closer, about 9 million kilometers from the sun. That probe lacks cameras, but it can survive inside the sun's atmosphere and make measurements of its plasma and magnetic fields. That enables scientists to track how heat and energy move about the corona.

By using both spacecraft together, researchers on the two teams had the chance to combine simultaneous measurements and images. Most significantly, they determined that turbulence within the sun's plasma contributes to the corona's heat—although they're not yet sure how much. Plasma is essentially a gas made of hot charged particles that emanate from the sun's surface. As it roils towards the corona, it transmits heat energy outwards, a bit like how the flames of a fire dissipate energy as they flicker. "Combining the data from the two spacecraft, while they are aligned but far apart, gives us the evolution of the plasma from one spacecraft [reading] to the next. Having that information is so crucial," says Nour Raouafi, the Parker Solar Probe project scientist, who was not involved in the research.

The new data also gives insight into another enigma that has stymied astrophysicists: how the solar wind accelerates to supersonic speeds. This wind is made up of charged particles flying along the sun's magnetic field lines, which seem to be propelled into the solar system by small, intermittent, explosive jets at the base of the corona. Zouganelis and his ESA colleagues think that turbulence higher up the corona is likely involved in speeding it up, too. "They all work together to make the solar wind the way it is," Raouafi says.

Scientists have good reasons to study how the sun works: Behavior below and within the corona affects the formation of solar flares and coronal mass ejections, which can wreak havoc on Earth if they get hurled in our direction.
The research is also important to space agencies preparing to send astronauts to the moon, outside the protective bubble of Earth's magnetic field.

The ESA and NASA solar orbiters launched a year apart, and it took some astronomical acrobatics to get them into the right configuration for their tandem measurements, which were planned for June 1, 2022. Both needed to be on the same plane of the sun at the same time. The probes were almost in just the right spots by that date, but Parker was slightly off to one side. Engineers had to roll the Solar Orbiter 45 degrees to get Parker in its field of view.

Scientists will get two more chances at this orbital conjunction: once at the end of this year, and again in March 2024. The researchers hope to make additional measurements to learn more about exactly how the sun heats up its atmosphere so much. After that, the Solar Orbiter will drift off the ecliptic plane to begin examining the sun's poles.

International teams of scientists have benefited from working together to share data while both probes are aloft. "If you add to that the 4-meter telescope in Hawaii, all three of them define a golden era for solar physics research," Raouafi says, referring to the Daniel K. Inouye Solar Telescope on Maui, which has instruments for resolving patterns created by hotter and cooler plasma on the sun's surface. Scientists can soon add India's Aditya-L1, a solar probe that launched on September 2, just 10 days after India's lander touched down on the moon. In a couple of months, Aditya-L1 will begin to study the sun's magnetic fields and solar storms as it orbits.

For solar scientists, this is a rare opportunity to create a detailed portrait of the sun. It is not only the star of our solar system, but the only star in the universe people can image in 3D. At last, they can finally snap the photos they've been waiting for.
Space Technology
Study quantifies satellite brightness, challenges ground-based astronomy

The ability to access the Internet or use a mobile phone anywhere in the world is increasingly taken for granted, but the brightness of the Internet and telecommunications satellites that enable global communications networks could pose problems for ground-based astronomy. University of Illinois Urbana-Champaign aerospace engineer Siegfried Eggl coordinated an international study confirming that recently deployed satellites are as bright as stars seen by the unaided eye.

"From our observations, we learned that AST Space Mobile's BlueWalker 3—a constellation prototype satellite featuring a roughly 700-square-foot phased-array antenna—reached a peak brightness of magnitude 0.4, making it one of the brightest objects in the night sky," Eggl said. "Although this is record breaking, the satellite itself is not our only concern. The untracked Launch Vehicle Adapter had an apparent visual magnitude of 5.5, which is also brighter than the International Astronomical Union recommendation of magnitude 7."

For comparison, the brightness of the stars we can see with an unaided eye runs from about magnitude minus 1 to magnitude 6, with minus 1 being the brightest. Sirius, the brightest star, is minus 1. Planets like Venus can sometimes be a bit brighter—closer to minus 4—but the faintest stars we can see are roughly magnitude 6.

(A video accompanying the study shows a starry sky with three satellites: BlueWalker 3 at 19:52:45, 19:52:56, 19:53:18 and 19:53:29; Starlink-4781 at 19:52:54 and 19:53:26, leading BlueWalker 3; and Starlink-4016 parallel to and slightly behind BlueWalker 3 at 19:53:34. Video courtesy: Marco Langbroek, Delft Technical University.)

"One might think if there are bright stars, a few more bright satellites won't make a difference. But several companies plan to launch constellations," Eggl said. "For example, Starlink already has permission to launch thousands of satellites, but they'll probably get their full request of tens of thousands granted eventually. And that's just one constellation of satellites. Europe and China want their own constellations and so does Russia. Just those in the United States being negotiated with the FCC amount to 400,000 satellites being launched in the near future. There are only 1,000 stars you can see with the unaided eye. Adding 400,000 bright satellites that move could completely change the night sky."

Eggl is a member of the International Astronomical Union Centre for the Protection of the Dark and Quiet Sky from Satellite Constellation Interference (IAU CPS).

"BlueWalker 3 is so bright that most of the big telescopes, such as the Rubin Observatory, believe it could obliterate large parts of exposures," Eggl said. "They already have to avoid observing Mars and Venus for the same reason, but we know where the planets are so we can dodge them. We cannot accurately predict where all the satellites will be years in advance. Just accepting recurring data loss in multi-billion-dollar observatories is not an option either."

He said that although satellites won't necessarily damage the telescopes' CCDs, or charge-coupled devices, they will still cause data loss from the streaks. Extremely bright satellites could ruin the entire field of view, like trying to stargaze while someone periodically shines a flashlight into your eyes. Eggl said several solutions to the problem are being explored in collaboration with the Laboratory for Advanced Space Systems at Illinois and satellite operators such as SpaceX.
"Starlink is looking at making their satellites' surfaces darker, which absorbs more and reflects less visible sunlight. But the absorption generates heat. The satellites then have to emit infrared light which means observations in optical wavelengths don't have as large of a problem, but infrared observations might. And heat is one of the biggest engineering problems that we have in space. So, painting everything black comes with repercussions," he said. Another idea from SpaceX is to make satellites' solar panels more reflective with dielectric mirrors. The mirrors allow the satellites to change the direction of the reflection so that it's not pointing directly at the Earth. "If SpaceX can make the solar panels point in a different direction to avoid glints, or use these mirror tricks, they might solve a lot of the problems we have with the optical flaring of Starlink satellites," Eggl said. "With other providers, it's not quite as easy. AST has gigantic satellites, with hundreds of square feet of electronic phased arrays, that they need to communicate with cell phones on the ground. If they made satellites smaller more of their radio signals would leak out through so-called 'side lobes' potentially affecting radio astronomy sites. Eggl said AST also prefers to keep the satellite pointed toward the surface of the Earth to achieve maximum efficiency. Starlink solutions may not easily translate to AST satellites and new mitigation strategies are needed. "We are trying to work with the space industry, where possible," he said. "We want to solve this together so it's a collaborative effort that everybody can sign onto because that's the fastest route to get things done." The study, "Optical observations of an ultrabright constellation satellite," was written by Sangeetha Nandakumar, Siegfried Eggl, Jeremy Tregloan-Reed, et al. It is published in the journal Nature. DOI:10.1038/s41586-023-06672-7 Ph.D. student Nandakumar analyzed the data for this first international study to be published from the center. Nandakumar works with Jeremy Tregloan-Reed at the Universidad de Atacama in Chile. More information: Sangeetha Nandakumar et al, The high optical brightness of the BlueWalker 3 satellite, Nature (2023). DOI: 10.1038/s41586-023-06672-7 Journal information: Nature
Space Technology