| Column | Type | Min length | Max length |
| --- | --- | --- | --- |
| full_name | string | 9 | 72 |
| url | string | 28 | 91 |
| description | string | 3 | 343 |
| readme | string | 1 | 207k |
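Each record below is one GitHub repository with those four string columns. As a minimal sketch of how a dataset with this schema could be consumed — assuming it is published on the Hugging Face Hub; the dataset ID below is a placeholder, not a real one — the `datasets` library can load and iterate it:

```python
# Minimal sketch: loading a README dataset with the columns listed above.
# The dataset ID is a placeholder (assumption); substitute the real one.
from datasets import load_dataset

ds = load_dataset("user/github-readmes", split="train")  # hypothetical ID

for row in ds.select(range(3)):
    print(row["full_name"], row["url"])
    print(row["description"])
    print(row["readme"][:200], "...")  # readmes can run up to ~207k characters
```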
kaixindelele/ChatSensitiveWords
https://github.com/kaixindelele/ChatSensitiveWords
Use an LLM plus a sensitive-word lexicon to automatically determine whether text involves sensitive words.
# ChatSensitiveWords

Use an LLM plus a sensitive-word lexicon to automatically determine whether text involves sensitive words. Already live on [chatpaper](https://chatwithpaper.org)!

# Motivation:

In [chatpaper](https://chatwithpaper.org) we provide the full features of the [academic GPT](https://github.com/binary-husky/gpt_academic) for free, which includes a hidden chat feature. We want to offer everyone an academic chat interface, but we cannot stop people from getting curious, or a small minority from deliberately attacking our output. So we had to urgently ship sensitive-word detection, especially for politically sensitive and pornographic terms.

# Approach:

1. Use a traditional sensitive-word lexicon to check whether the text contains sensitive words.
2. If it does, let Chat judge the semantics and decide whether it is a false positive.
3. Sensitive-word detection on the output (to do).

# Usage:

1. pip install -r requirements.txt
2. Replace the API key.
3. Confirm that the path to the sensitive-word lexicon is correct.

# Preliminary results:

<details><summary><code><b>Screenshots of some interesting cases</b></code></summary>

![GO5YV8RZ$$FOQ) 4 CEBHKG](https://github.com/kaixindelele/ChatSensitiveWords/assets/28528386/1a05c2a1-8572-4744-aa12-6098bb0b0826)
![M MNO8I853}YN RNLCGXNPR](https://github.com/kaixindelele/ChatSensitiveWords/assets/28528386/d46a16af-2b38-4ccb-856b-263b0c5aac00)
![FA LNCEEAC(XT05JVAQUJ~4](https://github.com/kaixindelele/ChatSensitiveWords/assets/28528386/23ca1586-3ed1-418d-8093-20999e9c6c03)
![7JU@}TJ 0)N8Z3{Z_63FE H](https://github.com/kaixindelele/ChatSensitiveWords/assets/28528386/eefd58ff-44e9-439c-8d4b-fbbd9e0ac8e8)

</details>

# Time cost analysis:

With no sensitive words, a check takes about 0.2 s; with sensitive words, about 0.2 s + 0.8 s. You are welcome to test and evaluate locally. **Please do not test this on our academic site!**

# Come play with us

The traditional sensitive-word detection is not finished yet, and the Chat prompt has not been tuned to its best. Stars and pull requests with new features are welcome!

# If this helps friends building websites, donations are also welcome. Thanks for the recognition!

![8F26{U54RLC EUL6(MNEZPA](https://github.com/kaixindelele/ChatSensitiveWords/assets/28528386/1e87e11a-fbe2-4f7d-a754-ce8f596c0a4d)

## Please do not package and sell this; if you do sell it, give me a cut...

# Starchart

[![Star History Chart](https://api.star-history.com/svg?repos=kaixindelele/ChatSensitiveWords&type=Date)](https://star-history.com/#kaixindelele/ChatSensitiveWords&Date)

## Contributors

<a href="https://github.com/kaixindelele/ChatSensitiveWords/graphs/contributors"> <img src="https://contrib.rocks/image?repo=kaixindelele/ChatSensitiveWords" /> </a>

## Citation:

Please cite the repo if you use the data or code in this repo.

```
@misc{ChatPaper,
  author={Yongle Luo},
  title = {ChatSensitiveWords: Use LLM and sensitive word library to check sensitive input text.},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/kaixindelele/ChatSensitiveWords}},
}
```
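The two-stage approach above (lexicon first, then an LLM check on hits) can be sketched as follows. This is a hedged illustration, not the repo's actual code; the function names and the lexicon file name are hypothetical placeholders.

```python
# Hedged sketch of the two-stage check described above; names are hypothetical.
def load_lexicon(path="sensitive_words.txt"):  # assumed lexicon file, one word per line
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

def lexicon_hits(text, lexicon):
    # Stage 1: plain substring matching against the sensitive-word lexicon.
    return [word for word in lexicon if word in text]

def llm_is_false_positive(text, hits):
    # Stage 2: ask the LLM whether the flagged words are harmless in this context.
    # Placeholder: wire this up to your chat-completion API; until then it
    # conservatively treats every hit as a real one.
    return False

def check(text, lexicon):
    hits = lexicon_hits(text, lexicon)
    if not hits:
        return "safe"      # fast path, roughly the ~0.2 s case in the README
    if llm_is_false_positive(text, hits):
        return "safe"      # lexicon hit judged to be a false positive
    return "blocked"
```

The second stage only runs on lexicon hits, which matches the timing breakdown above: the extra ~0.8 s LLM call happens only when a word is flagged.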
Xposed-Modules-Repo/com.unexpected.th
https://github.com/Xposed-Modules-Repo/com.unexpected.th
Telegram Helper
# Telegram Helper

# Note

- This project is no longer maintained

---

![image](https://github.com/Xposed-Modules-Repo/com.unexpected.th/blob/main/1559808980192.jpg)
verytinydever/react-copy-mailto
https://github.com/verytinydever/react-copy-mailto
null
## react-copy-mailto <!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section --> [![All Contributors](https://img.shields.io/badge/all_contributors-2-orange.svg?style=flat-square)](#contributors-) <!-- ALL-CONTRIBUTORS-BADGE:END --> [![npm version](https://badge.fury.io/js/react-copy-mailto.svg)](https://badge.fury.io/js/react-copy-mailto) ![npm bundle size](https://img.shields.io/bundlephobia/minzip/react-copy-mailto) A fully customizable React component for copying email from `mailto` links. ## Motivation The one thing we all can agree on that we hate it when the default mail app pops up after clicking on the `mailto` links. Most of the time we just want to copy the email address and that's where this module comes into play. Big shout out to [Kuldar](https://twitter.com/kkuldar) whose tweet [thread](https://twitter.com/kkuldar/status/1270736717939716097) inspired us to build this. ## Demo ![here](https://media.giphy.com/media/MEYN4oO6FZan5wXQ3w/giphy.gif) ## Installation and Usage The easiest way to use this library is to install it via yarn or npm ``` yarn add react-copy-mailto ``` or ``` npm install react-copy-mailto ``` Then just use it in your app: ```jsx import React from "react"; import CopyMailTo from "react-copy-mailto"; const YourComponent = () => ( <div> <CopyMailTo email="[email protected]" /> </div> ); ``` ## Props You can customize almost every aspect of this component using the below props, out of which **email is the only required prop**. | Name | Type | Default | Description | |:-: |--- |--- |--- | | email | string | none | The email to be copied. | | children | ReactNode | null | Use this if you want to use some custom component inside the anchor tag. | | defaultTooltip | string | "Copy email address" | Text shown in the tooltip when the user hovers over the link. | | copiedTooltip | string | "Copied to clipboard!" | Text shown in the tooltip when the user clicks on the link and the text is copied to clipboard. | | containerStyles | style object | none | The styles to be applied to the container. | | tooltipStyles | style object | none | The styles to be applied to the tooltip. | | anchorStyles | style object | none | The styles to be applied to the anchor. | ## Development - Install the dependencies ``` yarn ``` - Run the example on the development server ``` yarn demo:dev ``` ## Contributing [![GitHub issues](https://img.shields.io/github/issues-raw/devfolioco/react-copy-mailto?logo=github)](https://github.com/devfolioco/react-copy-mailto/issues) [![GitHub pull requests](https://img.shields.io/github/issues-pr/devfolioco/react-copy-mailto?logo=git)](https://github.com/devfolioco/react-copy-mailto/pulls) Feel free to open [issues](https://github.com/devfolioco/react-copy-mailto/issues/new/choose) and [pull requests](https://github.com/devfolioco/react-copy-mailto/pulls)! 
## License [![NPM](https://img.shields.io/npm/l/react-copy-mailto)](https://github.com/devfolioco/react-copy-mailto/blob/master/LICENSE) ## Contributors ✨ Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)): <!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section --> <!-- prettier-ignore-start --> <!-- markdownlint-disable --> <table> <tr> <td align="center"><a href="http://prateeksurana.me"><img src="https://avatars3.githubusercontent.com/u/21277179?v=4" width="100px;" alt=""/><br /><sub><b>Prateek Surana</b></sub></a><br /><a href="https://github.com/devfolioco/react-copy-mailto/commits?author=prateek3255" title="Code">💻</a> <a href="#design-prateek3255" title="Design">🎨</a> <a href="#content-prateek3255" title="Content">🖋</a> <a href="https://github.com/devfolioco/react-copy-mailto/commits?author=prateek3255" title="Documentation">📖</a></td> <td align="center"><a href="http://ankiiitraj.github.io"><img src="https://avatars2.githubusercontent.com/u/48787278?v=4" width="100px;" alt=""/><br /><sub><b>Ankit Raj</b></sub></a><br /><a href="#tool-ankiiitraj" title="Tools">🔧</a></td> </tr> </table> <!-- markdownlint-enable --> <!-- prettier-ignore-end --> <!-- ALL-CONTRIBUTORS-LIST:END --> This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind welcome!
nicholaschiasson/fcidr
https://github.com/nicholaschiasson/fcidr
Fragmented Classless Inter-Domain Routing (FCIDR)
# fcidr [![crates.io](https://img.shields.io/crates/v/fcidr)](https://crates.io/crates/fcidr) Fragmented Classless Inter-Domain Routing (FCIDR) A library exposing a data structure to represent a set of CIDR ranges as well as an interface to compute set operations over CIDRs. This data structure can be applied, for example, in configuring firewalls that *implicitly deny* (AWS Security Groups) using a rule set that explicitly expresses rules for both allow and deny. > **Note** > Currently, only IPv4 is supported. IPv6 support is tracked by [#6](https://github.com/nicholaschiasson/fcidr/issues/6). ## CLI This project also publishes a binary application for use on the command line to support composing chains of set operations on CIDRs by reading from standard input. ### Installation For now, crates.io is the only place this is being distributed. ``` cargo install fcidr ``` ### Usage ``` Fragmented Classless Inter-Domain Routing (FCIDR) Usage: fcidr [CIDR] <COMMAND> Commands: complement Compute the complement of the input CIDR(s) [aliases: !, not] difference Compute the set difference between the input CIDR(s) and another CIDR [aliases: -, exclude, minus] union Compute the set union of the input CIDR(s) and another CIDR [aliases: +, include, plus] help Print this message or the help of the given subcommand(s) Arguments: [CIDR] The input CIDR range and first operand to the computation. If omitted, input is taken from stdin. In this way, multiple computations can be chained together Options: -h, --help Print help -V, --version Print version` ``` ### Example ``` fcidr 10.0.0.0/8 difference 10.0.64.0/20 | fcidr difference 10.0.82.0/24 | fcidr union 10.0.82.74/31 10.0.0.0/18 10.0.80.0/23 10.0.82.74/31 10.0.83.0/24 10.0.84.0/22 10.0.88.0/21 10.0.96.0/19 10.0.128.0/17 10.1.0.0/16 10.2.0.0/15 10.4.0.0/14 10.8.0.0/13 10.16.0.0/12 10.32.0.0/11 10.64.0.0/10 10.128.0.0/9 ``` ``` fcidr 10.0.0.0/8 difference 10.0.64.0/20 | fcidr difference 10.0.82.0/24 | fcidr union 10.0.82.74/31 | fcidr complement 0.0.0.0/5 8.0.0.0/7 10.0.64.0/20 10.0.82.0/26 10.0.82.64/29 10.0.82.72/31 10.0.82.76/30 10.0.82.80/28 10.0.82.96/27 10.0.82.128/25 11.0.0.0/8 12.0.0.0/6 16.0.0.0/4 32.0.0.0/3 64.0.0.0/2 128.0.0.0/1 ``` Alternative concise syntax: ``` fcidr 10.0.0.0/8 + 127.0.0.0/16 | fcidr - 10.64.0.0/16 | fcidr ! 0.0.0.0/5 8.0.0.0/7 10.64.0.0/16 11.0.0.0/8 12.0.0.0/6 16.0.0.0/4 32.0.0.0/3 64.0.0.0/3 96.0.0.0/4 112.0.0.0/5 120.0.0.0/6 124.0.0.0/7 126.0.0.0/8 127.1.0.0/16 127.2.0.0/15 127.4.0.0/14 127.8.0.0/13 127.16.0.0/12 127.32.0.0/11 127.64.0.0/10 127.128.0.0/9 128.0.0.0/1 ```
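To illustrate the underlying idea of the set operations the README describes — excluding one CIDR from another fragments the original range into a set of smaller CIDRs — here is a hedged sketch using Python's standard-library `ipaddress` module. This is only an illustration of the concept, not the fcidr crate's Rust API.

```python
# Illustration of the set-difference idea described above, using Python's
# stdlib ipaddress module; this is not the fcidr crate's API.
from ipaddress import ip_network

net = ip_network("10.0.0.0/8")
hole = ip_network("10.0.64.0/20")

# Excluding a subnet fragments the original range into smaller CIDRs,
# analogous to `fcidr 10.0.0.0/8 difference 10.0.64.0/20`.
for fragment in sorted(net.address_exclude(hole)):
    print(fragment)
```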
junhoyeo/react-threads
https://github.com/junhoyeo/react-threads
Embed Static Threads in your React/Next.js application. UI components for Meta's Threads.
# react-threads [![NPM](https://img.shields.io/npm/v/react-threads.svg?style=flat-square&labelColor=black)](https://www.npmjs.com/package/react-threads) [![MIT License](https://img.shields.io/badge/license-MIT-blue?style=flat-square&labelColor=black)](https://github.com/junhoyeo/react-threads/blob/main/license) [![Prettier Code Formatting](https://img.shields.io/badge/code_style-prettier-brightgreen.svg?style=flat-square&labelColor=black)](https://prettier.io) > Embed Static Threads in your React/Next.js application. UI components for Meta's Threads. _Powered by **[junhoyeo/threads-api](https://github.com/junhoyeo/threads-api)**._ [![cover](.github/cover.jpg)](https://react-threads.vercel.app) ## Demo > **Warning**<br/> > Vercel Deployment is currently sometimes unstable. 🏴‍☠️ [![cover](.github/cover-netflix.png)](https://react-threads.vercel.app/CuUoEcbRFma) ## 📦 Setup First, install `react-threads` with it's dependencies. ```bash yarn add next react-threads threads-api yarn add -D tailwindcss postcss autoprefixer ``` Add path to `react-threads` to [content sources](https://tailwindcss.com/docs/content-configuration) of your Tailwind Configuration file(`tailwind.config.js`). ```js // tailwind.config.js /** @type {import('tailwindcss').Config} */ module.exports = { content: [ './src/**/*.{ts,tsx}', // path to `react-threads` 'node_modules/react-threads/**/*', ], }; ``` Set [`images.remotePatterns` to your app's `next.config.js`](https://nextjs.org/docs/pages/api-reference/components/image#remotepatterns). We use `next/image` to proxy images under the hood. ```js // next.config.js /** @type {import('next').NextConfig} */ const nextConfig = { images: { remotePatterns: [ { protocol: 'https', hostname: '**.cdninstagram.com', }, { protocol: 'https', hostname: '**.fbcdn.net', }, ], }, }; module.exports = nextConfig; ``` ## 🚀 Usage ```ts import { GetStaticProps, NextPage } from 'next'; import { BannerCTA, Thread } from 'react-threads'; import { Thread as ThreadPost, ThreadsAPI } from 'threads-api'; const threadsAPI = new ThreadsAPI(); type Props = { threadID: string; thread: ThreadPost; }; export const getStaticProps: GetStaticProps<Props, { threadId: string }> = async (context) => { try { const threadID = context.params?.threadId; if (!threadID) { console.log('[!] Thread ID not provided'); return { notFound: true }; } const postID = threadsAPI.getPostIDfromThreadID(threadID); if (!postID) { console.log( '[!] Post ID not found with provided Thread ID (in threadsAPI.getPostIDfromThreadID):', threadID, ); return { notFound: true }; } const thread = await threadsAPI.getThreads(postID); const { containing_thread } = thread; return { props: { threadID, thread: containing_thread, }, revalidate: 10, }; } catch (err) { console.error('[*] Error fetching Thread', err); throw err; } }; export async function getStaticPaths() { return { paths: [], fallback: true, }; } ``` ## 🏴‍☠️ Useful Building Blocks - Looking for an API client? - **[junhoyeo/threads-api ![](https://img.shields.io/github/stars/junhoyeo%2Fthreads-api?style=social)](https://github.com/junhoyeo/threads-api)** - Using [_Private Git Submodules_](https://github.com/orgs/vercel/discussions/44) when deploying with Vercel? 
- **[junhoyeo/vercel-submodules ![](https://img.shields.io/github/stars/junhoyeo%2Fvercel-submodules?style=social)](https://github.com/junhoyeo/vercel-submodules)** ## 🏴‍☠️ Inspired from 🤍 - [vercel-labs/react-tweet](https://github.com/vercel-labs/react-tweet) - [transitive-bullshit/react-static-tweets](https://github.com/transitive-bullshit/react-static-tweets) - [zernonia/tweetic](https://github.com/zernonia/tweetic) ## License <p align="center"> <a href="https://github.com/junhoyeo"> <img src="https://github.com/junhoyeo/threads-api/raw/main/.github/labtocat.png" width="256" height="256"> </a> </p> <p align="center"> <strong>MIT © <a href="https://github.com/junhoyeo">Junho Yeo</a></strong> </p> If you find this project intriguing, **please consider starring it(⭐)** or following me on [GitHub](https://github.com/junhoyeo) (I wouldn't say [Threads](https://www.threads.net/@_junhoyeo)). I code 24/7 and ship mind-breaking things on a regular basis, so your support definitely won't be in vain.
surqing/-Autox.js-
https://github.com/surqing/-Autox.js-
By installing autox.js on an Android device and running this project's script, you can automatically monitor Damai and automatically grab concert tickets.
# -Autox.js-

By installing [autox.js](https://github.com/kkevsekk1/AutoX) on an Android device and running this project's script, you can automatically monitor Damai and automatically grab concert tickets.

## During real ticket grabbing you will hit other errors or exceptions that are not handled; anyone interested is welcome to keep improving it

### Grabbing tickets is very hard, so please do not get your hopes up; this script is only for learning autoxjs

# How to use

1. Install the autoxjs build that fits your Android phone
2. Create a new .js script file in autoxjs
3. Copy the latest version of this script into the .js file
4. Modify the script according to the comments in it
5. Navigate to the page where you are waiting to purchase
6. Switch back to the script and tap run

# Disclaimer

This script is a personal learning project only; do not use it for any other purpose!
VolkanSah/The-Code-Interpreter-in-OpenAI-ChatGPT
https://github.com/VolkanSah/The-Code-Interpreter-in-OpenAI-ChatGPT
The code interpreter is a tool developed by OpenAI to execute programming code in an interactive environment. It is capable of running Python code and displaying the results in real-time.
# Exploring the Code Interpreter in OpenAI ChatGPT 4 The code interpreter is an advanced feature of OpenAI's ChatGPT that brings a new level of interactivity to the AI model. It is designed to execute Python code in a sandboxed environment and provide real-time results, making it a powerful tool for a wide range of tasks from mathematical computations to data analysis, from code prototyping to teaching and learning Python programming interactively. While there are certain limitations to its functionality due to security reasons, it opens up a whole new set of possibilities for how users can interact with ChatGPT. ## Table of Contents - [What is the Code Interpreter?](#what-is-the-code-interpreter) - [What is the Code Interpreter used for?](#what-is-the-code-interpreter-used-for) - [How can ChatGPT assist with programming?](#how-can-chatgpt-assist-with-programming) - [What are the limitations?](#what-are-the-limitations) - [What are the benefits?](#what-are-the-benefits) - [Data Storage](#data-storage) - [Detailed Explanation of the Data Storage](#detailed-explanation-of-the-data-storage) - [Working with Images](#working-with-images) - [How to enable Code Interpreter?](settings-ci.png) ## The Code Interpreter in OpenAI ChatGPT ### What is the Code Interpreter? The code interpreter is a tool developed by OpenAI to execute programming code in an interactive environment. It is capable of running Python code and displaying the results in real-time. ### What is the Code Interpreter used for? The code interpreter can be used for a variety of tasks, including: - Performing complex mathematical calculations - Analyzing and visualizing data - Prototyping and debugging Python code - Interactive learning and practicing Python programming ### How can ChatGPT assist with programming? ChatGPT can generate, review, and debug code based on the provided requirements. It can also assist in structuring code and provide suggestions for improvements. Moreover, it can explain complex programming concepts and assist in solving coding problems. ### What are the limitations? While the code interpreter is a powerful tool, it has certain limitations: - It does not have access to the internet. This means it cannot make external requests. - It runs in an isolated environment and does not have access to the operating system or its resources. - Code execution that takes longer than 120 seconds is automatically stopped. - It has access to a special location, '/mnt/data', where it can read and write files. Despite these limitations, the code interpreter is a versatile tool that can greatly assist programmers of all skill levels. ### What are the benefits? The code interpreter offers several benefits: - It provides a safe environment to run code without the risk of affecting the operating system or data. - It allows for real-time interaction with the code, providing immediate feedback. - It can assist in learning Python programming and improving coding skills. - It can handle a variety of tasks, from simple calculations to data analysis and visualization. ## Data Storage The code interpreter has access to a special directory, '/mnt/data', where it can read and write files. This can be used for operations that need to save or load data, like writing logs, saving plots, or loading data for analysis. However, no other locations on the filesystem can be accessed. ## Detailed Explanation of the Data Storage The '/mnt/data' directory is a special storage location that the code interpreter can access to read and write files. 
This is especially useful for operations that require persistent storage or the exchange of data between different code executions. Here are some ways you can use the '/mnt/data' directory: 1. **Saving and Loading Data Files:** If you're working with data in formats like .csv, .json, .txt, etc., you can read from and write to these files directly in this directory. For instance, to write a list of numbers to a .txt file, you would do: ```python with open('/mnt/data/numbers.txt', 'w') as file: for num in range(10): file.write(str(num) + '\n') ``` To read the file, you would do: ```python with open('/mnt/data/numbers.txt', 'r') as file: numbers = file.readlines() ``` 2. **Storing Logs:** If you're running code that generates logs (like debugging information, progress of a task, etc.), you can write these logs to a file in '/mnt/data'. ```python with open('/mnt/data/log.txt', 'w') as file: file.write('This is a log message.') ``` 3. **Saving Plots and Images:** If you're generating plots or other images with your code, you can save them to '/mnt/data' as .png, .jpg, or other image formats. For instance, if you're using matplotlib to create a plot, you can save it with: ```python import matplotlib.pyplot as plt plt.plot([0, 1, 2, 3, 4], [0, 1, 4, 9, 16]) plt.savefig('/mnt/data/plot.png') ``` You can then download the image file directly from the generated sandbox link. Remember, any file operations need to be done using the '/mnt/data' path. The code interpreter does not have access to any other locations on the filesystem. ## Working with Images With the help of various Python libraries such as PIL (Python Imaging Library), OpenCV, and matplotlib, a variety of operations can be performed on images. Here are some examples: 1. **Displaying Image:** Display an image. ```python from PIL import Image import matplotlib.pyplot as plt # Open the image file img = Image.open('/mnt/data/your_image.jpg') # Display the image plt.imshow(img) plt.axis('off') # Turn off the axis plt.show() ``` 2. **Resizing Image:** Change the size of an image, enlarge or shrink it. ```python # Resize the image img_resized = img.resize((new_width, new_height)) ``` 3. **Rotating or Flipping Image:** Rotate an image or flip it horizontally or vertically. ```python # Rotate the image img_rotated = img.rotate(angle) # Flip the image img_flipped = img.transpose(Image.FLIP_LEFT_RIGHT) ``` 4. **Color Conversions:** Convert an image to grayscale or change the color mode. ```python # Convert the image to grayscale img_gray = img.convert('L') ``` 5. **Adjusting Brightness, Contrast, and Saturation:** Adjust the brightness, contrast, or saturation of an image. ```python from PIL import ImageEnhance # Increase the brightness enhancer = ImageEnhance.Brightness(img) img_brighter = enhancer.enhance(1.5) ``` 6. **Applying Filters:** Apply different types of filters, like Gaussian blur, edge detection, etc. ```python from PIL import ImageFilter # Apply a filter img_blurred = img.filter(ImageFilter.GaussianBlur(radius=5)) ``` 7. **Image Analysis:** Perform simple image analysis, like calculating the histogram. ```python # Get the histogram hist = img.histogram() ``` 8. **Image Merging:** Merge multiple images into a single image. ```python # Merge images img_merged = Image.merge('RGB', [img1, img2, img3]) ``` Please note that while these operations can be performed on a technical level, as an AI model, It cannot make aesthetic or creative decisions. 
Any changes it makes to an image are based on technical instructions, not creative or artistic considerations. ### Contributing Contributions are welcome! Please feel free to submit a pull request. ## [❤️](https://jugendamt-deutschland.de) Thank you for your support! If you appreciate my work, please consider supporting me: - Become a Sponsor: [Link to my sponsorship page](https://github.com/sponsors/volkansah) - :star: my projects: Starring projects on GitHub helps increase their visibility and can help others find my work. - Follow me: Stay updated with my latest projects and releases. ### 👣 other GPT stuff - [Link to ChatGPT Shellmaster](https://github.com/VolkanSah/ChatGPT-ShellMaster/) - [GPT-Security-Best-Practices](https://github.com/VolkanSah/GPT-Security-Best-Practices) - [OpenAi cost calculator](https://github.com/VolkanSah/OpenAI-Cost-Calculator) - [GPT over CLI](https://github.com/VolkanSah/GPT-over-CLI) - [Secure Implementation of Artificial Intelligence (AI)](https://github.com/VolkanSah/Implementing-AI-Systems-Whitepaper) - [Comments Reply with GPT (davinci3)](https://github.com/VolkanSah/GPT-Comments-Reply-WordPress-Plugin) - [Basic GPT Webinterface](https://github.com/VolkanSah/GPT-API-Integration-in-HTML-CSS-with-JS-PHP) ### Credits - [Volkan Kücükbudak //NCF](https://github.com/volkansah) - and OpenAI's ChatGPT4 with Code Interpreter for providing interactive coding assistance and insights.
weiwosuoai/WeBlog
https://github.com/weiwosuoai/WeBlog
📗 A blog with a decoupled front end and back end, built with Spring Boot + Vue 3.2 + Vite. Thanks for giving it a Star!
# WeBlog

## Introduction

A blog with a decoupled front end and back end, built with Spring Boot + Vue 3.2 + Vite 4.3.

![Weblog admin dashboard](https://img.quanxiaoha.com/quanxiaoha/168887767469647 "Weblog admin dashboard")

- The back end uses Spring Boot, Mybatis Plus, MySQL, Spring Security, JWT, Minio, Guava, and more;
- The admin console is hand-built on Vue 3.2 + Vite + Element Plus, without any Admin framework;
- Supports publishing and editing posts in Markdown, plus management of article categories and tags;
- Supports configuring basic blog information and links out to social profiles;
- Supports dashboard statistics, with an Echarts heatmap of article publishing and PV (page view) counts.

## Links

- GitHub: [https://github.com/weiwosuoai/WeBlog](https://github.com/weiwosuoai/WeBlog)
- Gitee: [https://gitee.com/AllenJiang/WeBlog](https://gitee.com/AllenJiang/WeBlog)
- Demo: http://118.31.41.16/
- Guest account: test
- Guest password: test

> PS: The demo server has a very low-end configuration and little bandwidth. If it loads slowly, you can deploy it locally instead.

## Screenshots

### Login page

![Weblog login page](https://img.quanxiaoha.com/quanxiaoha/168887753888612 "Weblog login page")

### Dashboard

![Weblog admin dashboard](https://img.quanxiaoha.com/quanxiaoha/168887767469647 "Weblog admin dashboard")

### Article management

![Weblog article management](https://img.quanxiaoha.com/quanxiaoha/168888895520650 "Weblog article management")

### Writing a post

![Writing a post](https://img.quanxiaoha.com/quanxiaoha/168887786123214 "Writing a post")

### Front-end home page

![Blog home page](https://img.quanxiaoha.com/quanxiaoha/168950337932139 "Blog home page")

### Post detail

![Blog post detail page](https://img.quanxiaoha.com/quanxiaoha/168950345381866 "Blog post detail page")

### Archive page

![Weblog archive page](https://img.quanxiaoha.com/quanxiaoha/168950349918012 "Weblog archive page")

## Features

### Front end

| Feature | Done |
| --- | --- |
| Home page | ✅ |
| Category list | ✅ |
| Tag list | ✅ |
| Post detail | ✅ |
| Site search | TODO |
| Knowledge base (Wiki) | TODO |
| Comments | TODO |

### Admin

| Feature | Done |
| --- | --- |
| Admin login page | ✅ |
| Dashboard | ✅ |
| Article management | ✅ |
| Category management | ✅ |
| Tag management | ✅ |
| Blog settings | ✅ |
| Comment management | TODO |

## Modules

![WeBlog project modules](https://img.quanxiaoha.com/quanxiaoha/169077198059341 "WeBlog project modules")

| Project | Description |
| --- | --- |
| weblog-springboot | Back-end project |
| weblog-vue3 | Front-end project |
| sql | Database initialization scripts (table schema plus initial data) |

### Back-end modules

| Module | Description |
| --- | --- |
| weblog-module-admin | Blog admin management module |
| weblog-module-common | Common/shared module |
| weblog-module-jwt | JWT authentication and authorization module |
| weblog-web | Blog front end (startup entry point) |

## Tech stack

### Back end

| Framework | Description | Version | Notes |
| --- | --- | --- | --- |
| JDK | Java development kit | 1.8 | Still the most common version in enterprise projects |
| Spring Boot | Web application framework | 2.6.3 | |
| Maven | Build tool | 3.6.3 | The mainstream enterprise build tool |
| MySQL | Database | 5.7 | |
| Mybatis Plus | Enhanced MyBatis persistence framework | 3.5.2 | |
| HikariCP | Database connection pool | 4.0.3 | Spring Boot's built-in pool, known for strong performance |
| Spring Security | Security framework | 2.6.3 | |
| JWT | Web application tokens | 0.11.2 | |
| Lombok | Removes boilerplate code | 1.8.22 | |
| Jackson | JSON library | 2.13.1 | |
| Hibernate Validator | Parameter validation | 6.2.0.Final | |
| Logback | Logging | 1.2.10 | |
| Guava | Google's open-source utility library | 18.0 | |
| p6spy | SQL monitoring framework | 3.9.1 | |
| Minio | Object storage | 8.2.1 | Used to store the blog's images |
| flexmark | Markdown parsing | 0.62.2 | |

### Front end

| Framework | Description | Version |
| --- | --- | --- |
| Node | JavaScript runtime | 18.15.0 |
| Vue 3 | Progressive JavaScript framework | 3.2.47 |
| Vite | Front-end build tool | 4.3.9 |
| Element Plus | Vue 3 component library open-sourced by Ele.me | 2.3.3 |
| vue-router | Vue router | 4.1.6 |
| vuex | State management | 4.0.2 |
| md-editor-v3 | Markdown editor component | 3.0.1 |
| windicss | Utility-first CSS framework | 3.5.6 |
| axios | Promise-based HTTP client | 1.3.5 |
| Echarts | Data visualization library open-sourced by Baidu | 5.4.2 |

## Database initialization scripts

The initialization scripts live in the `sql` module:

![](https://img.quanxiaoha.com/quanxiaoha/169077045489682)

When deploying, first create a database named `weblog`, then run `schema.sql` and `data.sql` in that order:

- `schema.sql`: table schema script;
- `data.sql`: initial data script, such as the login user and the blog's basic settings.
horsefacts/trading-days
https://github.com/horsefacts/trading-days
A Uniswap v4 hook that reverts when markets are closed in New York.
# 🗽 trading-days ![Build Status](https://github.com/horsefacts/trading-days/actions/workflows/.github/workflows/test.yml/badge.svg?branch=main) A Uniswap v4 hook that reverts when markets are closed in New York. ## Views - `isCoreTradingHours`: Return `true` between 9:30 AM and 4:00 PM ET. - `isTradingDay`: Return `true` Mon-Fri, if it's not a NYSE holiday. - `isHoliday`: Return `true` if it's a NYSE holiday. - `getHoliday`: Get the current holiday (see the `Holiday` enum). - `isDST`: Return `true` if it's Daylight Savings Time in New York. - `time`: Adjust `block.timestamp` so that UTC date calculations are localized to US Eastern Time. Subtracts either 4 or 5 hours, depending on whether it's DST. - `marketIsOpen`: Return `true` if the market is currently open. - `state`: Get the current state of the market, one of: - `HOLIDAY` - `WEEKEND` - `AFTER_HOURS` - `OPEN` ## Errors - `ClosedForHoliday(Holiday holiday)`: Markets are closed for a [NYSE holiday](https://www.nyse.com/markets/hours-calendars). Error data includes a Holiday enum, one of: - `NEW_YEARS_DAY` - `MARTIN_LUTHER_KING_JR_DAY` - `WASHINGTONS_BIRTHDAY` - `GOOD_FRIDAY` - `MEMORIAL_DAY` - `JUNETEENTH_NATIONAL_INDEPENDENCE_DAY` - `INDEPENDENCE_DAY` - `LABOR_DAY` - `THANKSGIVING_DAY` - `CHRISTMAS_DAY` - `NEW_YEARS_DAY_OBSERVED` - `ClosedForWeekend`: Markets are closed for the weekend. - `AfterHours`: Markets are closed on weekdays before 9:30 AM and after 4:00 PM ET. ## Events - `DingDingDing`: If you perform the first swap of the day, you get to ring the opening bell! ## Deployments | Network | TradingView | | ------- | ----------------------------------------------------------------------------------------------------------------------------------- | | Goerli | [`0x000000006016c17169c289346015b3c9d42a1218`](https://goerli.etherscan.io/address/0x000000006016c17169c289346015b3c9d42a1218#code) | | Mainnet | [`0x000000006016c17169c289346015b3c9d42a1218`](https://etherscan.io/address/0x000000006016c17169c289346015b3c9d42a1218#code) | ## Technical Details The NYSE holiday calendar and Daylight Savings start/end timestamps are stored as [data contracts](https://github.com/dragonfly-xyz/useful-solidity-patterns/tree/main/patterns/big-data-storage). NYSE holidays were precalculated through 2123 using the Python [holidays](https://pypi.org/project/holidays/) package. Each 13-byte sequence encodes one year, which includes up to 11 holidays. Each holiday is encoded as 9 bits, 4 for the month and 5 for the day. A year may have 9, 10, or 11 holidays, depending on whether New Year's Day of the next year falls on a Saturday. The start and end timestamps for Daylight Savings were precalculated through 2123 using the `calculate_dst.py` script in this repo. The data contract stores each start/end pair as an 8-byte sequence, encoding two 32-bit values representing seconds since Jan 1, 2023. These represent the exact start and end timestamp of Daylight Savings Time in New York, according to current [DST rules](https://www.nist.gov/pml/time-and-frequency-division/popular-links/daylight-saving-time-dst). (That is, DST starts at 2am local time on the second Sunday of March and ends 2am local time on the first Sunday of November). 
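The Technical Details above describe a compact encoding: each holiday is 9 bits (4 for the month, 5 for the day), and up to 11 holidays per year fit in a 13-byte sequence. The following Python snippet is a hedged sketch of that packing scheme for illustration only; the bit order and padding are assumptions, and this is not the contract's actual Solidity encoding.

```python
# Hedged sketch of the 9-bit (month, day) packing described above.
# Bit order and zero-padding are assumptions, not the contract's exact layout.
def pack_year(holidays):  # holidays: list of (month, day), at most 11 entries
    bits = 0
    for month, day in holidays:
        bits = (bits << 9) | (month << 5) | day   # 4 bits month, 5 bits day
    bits <<= 9 * (11 - len(holidays))             # pad unused slots with zeros
    return bits.to_bytes(13, "big")               # 11 * 9 = 99 bits fits in 13 bytes

def unpack_year(blob):
    bits = int.from_bytes(blob, "big")
    holidays = []
    for i in range(11):
        slot = (bits >> (9 * (10 - i))) & 0x1FF
        if slot:                                   # a zero slot means no holiday
            holidays.append(((slot >> 5) & 0xF, slot & 0x1F))
    return holidays

packed = pack_year([(1, 2), (1, 16), (2, 20)])     # e.g. three early-year holidays
print(unpack_year(packed))                          # -> [(1, 2), (1, 16), (2, 20)]
```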
## Acknowledgments Inspired by [Mariano's](https://github.com/nanexcool) legendary "office hours" modifier: <a href="https://twitter.com/nanexcool/status/1259623747339849729" target="_blank"> <img src="./img/office_hours.png" width=480px /> </a> Thanks to: - [saucepoint](https://twitter.com/saucepoint) for their [v4 hooks template](https://github.com/saucepoint/v4-template) - [BokkyPooBah](https://twitter.com/BokkyPooBah) for [BokkyPooBahsDateTimeLibrary](https://github.com/bokkypoobah/BokkyPooBahsDateTimeLibrary)
Ambershawl656/AM
https://github.com/Ambershawl656/AM
null
# AM

Real Time Operating System Lessons using Arduino and the FreeRTOS Kernel.

Topics: learning, arduino, examples, atmel, freertos, learning-exercise, arduino-boards, arduino-platform, example-project, arduino-uno, rtos, arduino-mega, example-repo, example-codes
mierenhoop/advent-of-beans
https://github.com/mierenhoop/advent-of-beans
null
# Advent of beans Advent of Code clone built with [Redbean](https://redbean.dev/). ## Status *I will pause development on this project because it is currently in a working state. I did not plan to create actual puzzles and host an event, therefore adding new features might be a waste of time. If anyone is interested in using this project for hosting their event, create a GitHub issue or contact me so I can help you get it set up.* As of the time of writing it has the following features: * Page listing all the puzzles * Puzzles can have a starting time * Puzzle inputs are stored in buckets; some users will share a bucket * Timeout after wrong answer submissions * Leaderboard for everything & individual puzzles * Silver and gold stars * User profile page * GitHub integration * Cached remote resources to enable use in private networks * All-in-one binary with control of the database via the command line * No JavaScript Todo list: * Rate limiting/DDoS protection * Support for multiple events
Venusdev2113/javascript-serveral-animation-effect
https://github.com/Venusdev2113/javascript-serveral-animation-effect
I made this project, which includes a lot of animation effects.
# javascript-serveral-animation-effect I made this project, which includes a lot of animation effects.
lablab-ai/Google-VertexAI-FastAPI
https://github.com/lablab-ai/Google-VertexAI-FastAPI
Simple boilerplate to get started with Generative AI models from Google Vertex AI based on FastAPI
# Getting started with Vertex AI Generative AI ## Before you begin This is a simple starter boilerplate that gives you a basic FastAPI setup with a few endpoints. It is meant to be used as a starting point for your own projects. ### Clone and install dependencies In your terminal, run the following commands: ``` git clone [email protected]:lablab-ai/Google-VertexAI-FastAPI.git cd Google-VertexAI-FastAPI cd app pip install -r requirements.txt ``` ### Update the project auth In order to use the Vertex AI SDK, you will need to update the project auth using a service account. In the `app` folder, create the file `service_account.json`. If you don't have it yet, create it by running the following command in your terminal: `touch service_account.json` Then paste the content of your service account JSON file into `service_account.json`. It should look like this: ``` { "type": "service_account", "project_id": "YOUR_PROJECT_ID", "private_key_id": "YOUR_PRIVATE_KEY_ID", "private_key": "YOUR_PRIVATE_KEY", "client_email": "YOUR_CLIENT_EMAIL", "client_id": "YOUR_CLIENT_ID", "auth_uri": "YOUR_AUTH_URI", "token_uri": "YOUR_TOKEN_URI", "auth_provider_x509_cert_url": "YOUR_AUTH_PROVIDER_X509_CERT_URL", "client_x509_cert_url": "YOUR_CLIENT_X509_CERT_URL", "universe_domain": "YOUR_UNIVERSE_DOMAIN" } ``` You can find your service account JSON file in the Vertex AI console under `Settings > Service account`, or it may have been provided to you by lablab.ai (if you are part of the Google Vertex AI hackathon). ### Start the server and test Once you have installed the dependencies, you can start the server by running `uvicorn main:app --reload --port 8080` in the `app` directory. When the server is running, you can test it by going to `http://localhost:8080/docs` in your browser. You should see the Swagger UI where you can test the endpoints. ![image](https://github.com/lablab-ai/Google-VertexAI-FastAPI/assets/2171273/13df1172-0b77-43f3-85a0-0bf936bbd8db) ![image](https://github.com/lablab-ai/Google-VertexAI-FastAPI/assets/2171273/e69f7892-6945-4d85-987e-dbbc23e553bd) Good luck, and don't forget to star this repo if you like it! **Thank you** for reading! If you enjoyed this tutorial you can find more and continue reading [on our tutorial page](https://lablab.ai/t/) --- [![Artificial Intelligence Hackathons, tutorials and Boilerplates](https://storage.googleapis.com/lablab-static-eu/images/github/lablab-banner.jpg)](https://lablab.ai) ## Join the LabLab Discord ![Discord Banner 1](https://discordapp.com/api/guilds/877056448956346408/widget.png?style=banner1) On the lablab Discord, we discuss this repo and many other topics related to artificial intelligence! Check out upcoming [Artificial Intelligence Hackathons](https://lablab.ai) events [![Acclerating innovation through acceleration](https://storage.googleapis.com/lablab-static-eu/images/github/nn-group-loggos.jpg)](https://newnative.ai)
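For orientation, here is a hedged sketch of what a minimal endpoint in this kind of boilerplate might look like, wired to the `service_account.json` auth described above. The route, model name, region, and structure are assumptions for illustration, not necessarily what this repo's `main.py` actually contains.

```python
# Hedged sketch of a minimal FastAPI endpoint calling Vertex AI using the
# service_account.json described above. Route, model name, and region are
# assumptions, not necessarily this repo's actual main.py.
import vertexai
from fastapi import FastAPI
from google.oauth2 import service_account
from vertexai.language_models import TextGenerationModel

credentials = service_account.Credentials.from_service_account_file(
    "service_account.json",
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
vertexai.init(project=credentials.project_id, location="us-central1", credentials=credentials)

app = FastAPI()

@app.get("/generate")
def generate(prompt: str):
    model = TextGenerationModel.from_pretrained("text-bison@001")  # assumed model name
    response = model.predict(prompt, max_output_tokens=256)
    return {"text": response.text}
```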
DeepGraphLearning/ProtST
https://github.com/DeepGraphLearning/ProtST
Camera-ready repo for ProtST
# ProtST: Multi-Modality Learning of Protein Sequences and Biomedical Texts ProtST is an advanced pretraining framework for protein sequence understanding and prediction, as introduced in our [ICML2023 oral paper](https://arxiv.org/abs/2301.12040). It is designed to enhance protein sequence pre-training and understanding by integrating protein functions and other important properties through biomedical texts. The effectiveness and superiority of ProtST-induced PLMs over previous ones are demonstrated on diverse representation learning downstream tasks and zero-shot predictions. It also enables functional protein retrieval from large-scale databases even without any function annotation, as illustrated below. ![ProtST](asset/framework.png) # Installation # You may install the dependencies of TorchProtein and ProtST as below. Generally, they work with Python 3.7/3.8 and PyTorch version >= 1.8.0. ```bash conda create -n protein python=3.9 conda activate protein conda install pytorch==2.0.0 pytorch-cuda=11.7 -c pytorch -c nvidia conda install torchdrug pytorch-sparse pytorch-scatter pytorch-cluster -c pytorch -c pyg -c milagraph conda install scikit-learn pandas decorator ipython networkx tqdm matplotlib -y conda install fair-esm transformers easydict pyyaml lmdb -c conda-forge ``` # Pre-trained Model Zoo | Model | Config | Ckpt | |:---------------:|:------------------------------------------------:|:-------------------------------------------------------------------------------------------------------------------------------------------------------:| | ProtST-ESM-1b | [config](config/pretrain/pretrain_esm.yaml) | [ckpt](https://protsl.s3.us-east-2.amazonaws.com/checkpoints/protst_esm1b.pth) | | ProtST-ESM-2 | [config](config/pretrain/pretrain_esm.yaml) | [ckpt](https://protsl.s3.us-east-2.amazonaws.com/checkpoints/protst_esm2.pth) | | | ProtST-ProtBert | [config](config/pretrain/pretrain_protbert.yaml) | [ckpt](https://protsl.s3.us-east-2.amazonaws.com/checkpoints/protst_protbert.pth) | # Usage To reproduce all the experiments in ProtST, we provide all the necessary configuration files at `config/.../*.yaml`, which are categorized by the dataset, model architecture, and hyperparameters. When running experiments, we specify the configuration file with an argument `--config` and all the required arguments marked by `{{ }}` in that configuration file. Note that all the datasets will be automatically downloaded in the code. But if you are using clusters without Internet connection, please run `python ./script/prepare_all_datasets.py` to cache datasets in advance. ## Pre-training By default, we pretrain 3 different PLM backbones (ESM-1b, ESM2 and ProtBert) using 4 V100 GPUs with the following command. Note that we have the choice of using two versions of text encoders: PebMedBert trained with only abstracts `PebMedBert-abs` and PebMedBert trained with full papers `PebMedBert-full`. 
``` alias python4proc='python -m torch.distributed.launch --nproc_per_node=4' # pretrain ESM-1b python4proc script/run_pretrain.py --config ./config/pretrain/pretrain_esm.yaml --protein_model ESM-1b --text_model PubMedBERT-abs # pretrain ESM-2 python4proc script/run_pretrain.py --config ./config/pretrain/pretrain_esm.yaml --protein_model ESM-2-650M --text_model PubMedBERT-abs # pretrain ProtBert python4proc script/run_pretrain.py --config ./config/pretrain/pretrain_protbert.yaml --text_model PubMedBERT-abs ``` ## Downstream Tasks: Representation Learning For representation learning, we verify our pre-trained multimodal PLMs on 11 standard benchmarks for protein localization prediction, fitness landscape prediction and protein function annotation, under both fix-encoder learning and full-model tuning settings. We label the pretrained checkpoints as `PRETRAIN_CHECKPOINT`. For different PLM backbone, the corresponding configuration files are in `./config/downstream_task/.../*.yaml`. We give a demonstration for ProtST-enhanced ESM-1b. ### Protein Localization Prediction For binary localization prediction, you can run as below to perform fix-encoder learning and full-model tuning, respectively: ``` # fix-encoder learning python4proc ./script/run_downstream.py --config ./config/downstream_task/PretrainESM/localization_fix.yaml --checkpoint $PRETRAIN_CHECKPOINT --dataset BinaryLocalization --num_class 2 # full-model tuning python4proc ./script/run_downstream.py --config ./config/downstream_task/PretrainESM/localization_tune.yaml --checkpoint $PRETRAIN_CHECKPOINT --dataset BinaryLocalization --num_class 2 ``` **Note that**, subcellular localization can be performed in the similar way (please see `./config` for details). ### Fitness Landscape Prediction For Beta-Lactamase fitness prediction, you can run as below to perform fix-encoder learning and full-model tuning, respectively: ``` # fix-encoder learning python4proc ./script/run_downstream.py --config ./config/downstream_task/PretrainESM/fitness_fix.yaml --checkpoint $PRETRAIN_CHECKPOINT --dataset BetaLactamase --batch_size 32 # full-model tuning python4proc ./script/run_downstream.py --config ./config/downstream_task/PretrainESM/fitness_tune.yaml --checkpoint $PRETRAIN_CHECKPOINT --dataset BetaLactamase --batch_size 6 ``` **Note that**, Fluorescence, Stability, AAV and Thermostability prediction can be performed in the similar way (please see `./config` for details). ### Protein Function Annotation For Enzyme Commission (EC) number prediction, you can run as below to perform full-model tuning: ``` python4proc ./script/run_downstream.py --config ./config/downstream_task/PretrainESM/annotation_tune.yaml --checkpoint $PRETRAIN_CHECKPOINT --dataset td_datasets.EnzymeCommission --branch null ``` **Note that**, the Gene Ontology (GO) term prediction at Molecular Function (MF), Biological Process (BP) and Cellular Component (CC) branches can be performed in the similar way (please see `./config` for details). ## Downstream Tasks: Zero-shot Protein Classification ### Zero-shot Predictors ProtST supports zero-shot protein classification, where it does not require any labeled protein. This is achieved by comparing representation similarities between a query protein and all labels, thanks to the aligned representation space of protein sequences and label descriptions in ProtST. We demonstrate on zero-shot subcellular localization prediction and zero-shot reaction classification with ProtST-enhanced ESM-1b. 
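Conceptually, the zero-shot prediction step just described boils down to a nearest-label search in the shared embedding space: the protein embedding is scored against the embedding of every label description, and the best-scoring label wins. The following is a hedged, framework-agnostic sketch of that scoring step, not ProtST's actual implementation; the embeddings here are random placeholders for the aligned protein/text encoders.

```python
# Conceptual sketch of zero-shot classification by representation similarity,
# as described above; this is not ProtST's actual code.
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def zero_shot_predict(protein_embedding, label_embeddings, label_names):
    # Score the query protein against every label description and take the best.
    scores = [cosine(protein_embedding, emb) for emb in label_embeddings]
    return label_names[int(np.argmax(scores))]

# Toy example with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
protein = rng.normal(size=512)
labels = {"Nucleus": rng.normal(size=512), "Cytoplasm": rng.normal(size=512)}
print(zero_shot_predict(protein, list(labels.values()), list(labels.keys())))
```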
We have also explored different prompt templates and description fields as listed in `./data/zero_shot_classification/`. ``` # Subcellular Localization Prediction python ./script/run_zero_shot.py --config ./config/zero_shot/PretrainESM/zero_shot.yaml --checkpoint $PRETRAIN_CHECKPOINT --prompt_label ./data/zero_shot_classification/subloc_name.tsv --dataset SubcellularLocalization --field "['name']" # Reaction Classification python ./script/run_zero_shot.py --config ./config/zero_shot/PretrainESM/zero_shot.yaml --checkpoint $PRETRAIN_CHECKPOINT --prompt_label ./data/zero_shot_classification/reaction_name.tsv --dataset Reaction --field "['name']" ``` ### Few-shot and Non-parametric Baselines ProtST-induced zero-shot classifiers have better data efficiency against various few-shot and non-parametric classifiers. You can run these baselines as below: ``` # few-shot classifiers ## Subcellular Localization Prediction python ./script/run_few_shot.py --config ./config/few_shot/PretrainESM/few_shot.yaml --dataset SubcellularLocalization --num_class 10 --checkpoint $PRETRAIN_CHECKPOINT ## Reaction Classification python ./script/run_few_shot.py --config ./config/few_shot/PretrainESM/few_shot.yaml --dataset Reaction --num_class 384 --checkpoint $PRETRAIN_CHECKPOINT # non-parametric few-shot classifiers ## Subcellular Localization Prediction python ./script/run_few_shot_nonparam.py --config ./config/few_shot/PretrainESM/few_shot.yaml --dataset SubcellularLocalization --num_class 10 --checkpoint $PRETRAIN_CHECKPOINT ## Reaction Classification python ./script/run_few_shot_nonparam.py --config ./config/few_shot/PretrainESM/few_shot.yaml --dataset Reaction --num_class 384 --checkpoint $PRETRAIN_CHECKPOINT ``` ### Predictor Ensemble We also show that ProtST-based zero-shot predictor can enhance the performance of supervised learning models via ensemble. We use the following scripts to do ensembles, where `SUPERVISED_CHECKPOINT` refers to the checkpoints obtained by supervised learning on downstream tasks. ``` ## Subcellular Localization Prediction python ./script/run_supervised_with_zero.py -sc ./config/downstream_task/PretrainESM/localization_fix.yaml -zc ./config/zero_shot/zero_shot.yaml --dataset SubcellularLocalization --num_class 10 --prompt_label ./data/zero_shot_classification/subloc_name.tsv --field "['name']" --checkpoint $PRETRAIN_CHECKPOINT --supervised_checkpoint $SUPERVISED_CHECKPOINT ## Reaction Classification python ./script/run_supervised_with_zero.py -sc ./config/downstream_task/PretrainESM/reaction_tune.yaml -zc ./config/zero_shot/zero_shot.yaml --dataset Reaction --num_class 384 --prompt_label ./data/zero_shot_classification/reaction_name.tsv --field "['name']" --checkpoint $PRETRAIN_CHECKPOINT --supervised_checkpoint $SUPERVISED_CHECKPOINT ``` ## Downstream Tasks: Text to Protein Retrieval We illustrate the capability of ProtST-ESM-1b on retrieving functional proteins as below, where no function annotation is required: ``` python ./script/run_t2p_retrieval.py --config ./config/t2p_retrieval/go_mf.yaml --checkpoint $PRETRAIN_CHECKPOINT ``` # License This codebase is released under the Apache License 2.0 as in the [LICENSE](LICENSE) file. 
# Citation If you find this project helpful, please cite our paper: ``` @article{xu2023protst, title={ProtST: Multi-Modality Learning of Protein Sequences and Biomedical Texts}, author={Xu, Minghao and Yuan, Xinyu and Miret, Santiago and Tang, Jian}, journal={arXiv preprint arXiv:2301.12040}, year={2023} } ``` # Contact For any questions or issues, open an issue or contact Minghao Xu ([email protected]) and Xinyu Yuan ([email protected]).
CarlJamison/cosmic-air-host
https://github.com/CarlJamison/cosmic-air-host
Store entire html pages in a single url
Compresses and creates a link for simple, self-contained html sites. Imagine a scenario where you have a quick proof of concept or cool visual that you want to show to someone, but don't feel like hosting it anywhere. For example, something like [this](https://carljamison.github.io/TimeKiller/). You can use this project to create a link, where all the scripts and styling are stored in the url. In the above case, [this](https://air-host.azurewebsites.net/h/sVW7W_iNhj_V0Kk0bgxNKHlJjkYdGq560n0qMqkXZWLthxxIRrElRPoKsz_vsd2Agm9Tvu2DwQ_b_bPz6vlYCC_WPKj7BNvOJBnlpzQD1eelSZ0_XodZ9s4t-SU-l7vaji4kGegkw1lmM_jFaN9vORiTsNI_-c0tG9sbM_gdwu_qR3hfMlYoRSe49XKrNab4ppvsoJ6-Dlnm4RTH9-Mr6d399PZ-GEG1Pjr-OHz4x-fP375Or6BowMZje_uf3uktmUHMlzFeTGr9pVnVCK7gmojPC_-pvKsK6XdS4AE7FLIR7n_PU2K5YXGDdxJyb1l6WJZlOxA2YH-9FzTWJGgODFkIF05kTt9ozCN1OEHJIYOpGOJTfZJxGvmoJ0MGZU40C5SH7lxGB22WDdh4AqWIKTd1pV_Kv4ufXJqMkWybrxgrqs82qKsq5TbbeCyjInFawdYa1bEP_gqzdcI2_c2PWhVSgO1KMSRYbarK8IZQ9ZdMLHWm2u4sBfChz2oJ33pG00wXKVPrEjXDMmtteVpYv2VrlbXgsXFRjBHuTEU1A8qc2ViMDTvIahgz4Inm7kyQlgp3NcUTrVh_5_qyc4d37I1HAHi_V5psCfB8qVJCcfgkbgDETy3b8GGWLarA_Li2t-L78XsmbFEMaXw2eW5EV0IhPYqoDWQO71TMNsIARmcpNniehkLB9ISnCX4C3znfIUCiF3Hb1GhDDhkmri47PgIF1T8AgucldgDudVBy-B-GYX7ItzzKK0KRHN1YYWQrDnSwHL5giKEfQ8UXbemeiyrY-TOabeP4yQ5BGcHIImG6nIMUIkG7BZYBYiYOOEqvuQYap2oJMPHfCP15MMqe4hJImyOJhUGDFlDvD3C3h7yxVcurSMCp-oieN7kSyBMwwg1xCjU6CLKlFEzybCAe-5qbaPdbolRlb3fbPKzfSTBhwKjGonJKVVr0Lmqong3zN_eDfOhuuhlUA85RNiqIXF5icV1dCYgjcnb79kqZ5ZCcVtDAbQkLeoUTSwqgx3ImwYWpIqxqAq20UBHzhFdpYEb3iywjxBxikNjadjjo3mDryJ6wD1teO-_IFbeU1cjCEr4rfZtUxv_i-r0RFU5rfh__FFleKMn6XRasOI-37CEP8RZwtfQlXxKxWgNapDNGj90BnMD0jsVGYHrI3J5KnIPVlcgarffmkGeaUxvEhqqCHOAt7UAnSPgfI4gjCOfaBpzOFDRfUO7lfzXSt4x8kvS8d-coGPz3gmmJoCOQh6dnFbKQOL60cnJpcxt2HVO7WBbIolCpDQAg5mmAzPGhjDR1UIYWgA9hWyxO7Yu3bJcI9hJVS8fcZ3WBJq82rF0b3ktYFJawkAj552mc0QGvcxEhgoTG8oh16A9GqzNDFFug5iaR9KoeisRs4A5bhZmRjUG3k5e6LcHUV0yEAM5DYTrIsXlwOADOQlgA1OpBhxX94XBVIwKc1394AqyFj2-cbSOGkowe-HPvOjCLFJePFLUFosfju32-h_OYWi5NrbeI5B5rkk6K17hNXnYAjOalZIHNgfnlu8xXv7Xvgif4APL_f7NyIP5ObnoIT3yYD1Vaz3yfO847vqeZ-acalW1SQczVM830C1nG0zfw1iDV2RaPfoglnL5D8p4-45SCwAA) would be result. This is obviously only useful in specific limited scenarios. If the html is not self-contained, or references scripts, stylings, images, etc. that it can't access, than this won't work. If the html is too complex, than this will also not work. This is intended for small, quick projects where it might not make sense to host it anywhere.
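To make the idea concrete, an HTML page can be stored inside a URL by compressing it and then URL-safe base64-encoding the result; decoding simply reverses the steps. The snippet below is a hedged illustration of that general technique in Python, and the host path is a hypothetical placeholder; the project's actual encoding scheme may differ.

```python
# Illustration of storing a whole HTML page inside a URL: compress, then
# URL-safe base64-encode. The real project's exact scheme may differ; this
# just shows why the links above look like long opaque strings.
import base64
import zlib

html = "<!doctype html><html><body><h1>Hello</h1><script>console.log('hi')</script></body></html>"

token = base64.urlsafe_b64encode(zlib.compress(html.encode("utf-8"), 9)).decode("ascii")
url = f"https://example.invalid/h/{token}"  # hypothetical host path

# Decoding reverses the steps to recover the original page.
restored = zlib.decompress(base64.urlsafe_b64decode(token)).decode("utf-8")
assert restored == html
print(len(html), "->", len(token), "characters in the URL")
```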
luolin-ai/FastGPT-Enterprise-WeChatbot
https://github.com/luolin-ai/FastGPT-Enterprise-WeChatbot
A bot based on a WeCom (Enterprise WeChat) GPT knowledge base that automatically replies to messages received in WeCom. The bot can handle both private and group chats, and it remembers the conversation history with each user so it can respond in a more context-aware way. You can also set up a whitelist to control which users or groups the bot interacts with. To hook up your own knowledge-base site, contact WeChat: aiwis99
# FastGPT-Enterprise-WeChatbot

# Enterprise-WeChat-GPTbot

A WeChat bot based on WeCom (Enterprise WeChat) external groups, with FastGPT support.

# WeCom bot tutorial:

1. Open the project and download WeCom version 4.0.8.6027. Download: https://drive.weixin.qq.com/s?k=ANMA2gc2ACsWAjgBOT To hook up your own knowledge-base site, contact WeChat: gojiberrys
2. Configuration: open https://fastgpt.run/ and click to create an AI
![image](https://github.com/luolin-ai/Enterprise-WeChat-GPTbot/assets/135555634/49b7e76a-2908-4431-a9e1-26ceb1702ebb)
On the site, click to create a new application, then click to generate a key
![微信截图_20230703101749](https://github.com/luolin-ai/Enterprise-WeChat-GPTbot/assets/135555634/2d4fbf96-879d-4c71-aae8-04e152616829)
![image](https://github.com/luolin-ai/Enterprise-WeChat-GPTbot/assets/135555634/bb9e7f3c-0e91-4cd2-ab2f-13d35a92b3e3)
![image](https://github.com/luolin-ai/Enterprise-WeChat-GPTbot/assets/135555634/a3987928-8271-4fd2-8b5f-c7af94360ecc)

# Scan the QR code to join the discussion group

![49315136d4c528b768545a1f9cb9ac3](https://github.com/luolin-ai/FastGPT-Enterprise-WeChatbot/assets/135555634/60245c03-71c1-4536-ac55-ef69f7cc90ea)

Hello everyone! I am very happy to tell you that I have successfully finished my code project! Through persistent effort and teamwork, we built a smarter WeCom chatbot.

Along the way I realized that, to improve the project further and provide a better user experience, we need some financial support. I am therefore asking for sponsorship. The funds will go toward the necessary resources, code improvements, and new features so that we can provide the best possible service.

As a sponsor, you will enjoy the following perks:

Early access to new features and updates: you will be the first to try the project's latest features and improvements.

A say in project decisions and feature discussions: your voice matters to the project's direction, and you will be invited to take part in decisions and feature discussions.

Special thanks and recognition: as a sponsor you will receive our special thanks, and your support will be called out in the project.

If you would like to support me and this project, you can sponsor in the following ways:

Whether or not you can sponsor, I am very grateful for your attention and support. If you have any questions, suggestions, or feedback, feel free to reach out. Let's work together for the project's success!

Thank you all! Whether or not you can sponsor, your support means a lot to me. Let's watch this project grow and improve together!

![1e466e5accd00e5851137bd45afca449](https://github.com/luolin-ai/Enterprise-WeChat-GPTbot/assets/135555634/9148c9ec-6b13-42a7-a73c-9f8d75984d01)
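The description above mentions per-chat memory and a whitelist. As a hedged, generic sketch of that reply loop (this is not the repo's actual code, which drives the WeCom desktop client; `ask_knowledge_base` is a placeholder for whatever FastGPT/LLM call the bot makes):

```python
# Generic sketch of the whitelist + per-chat memory behaviour described above.
from collections import defaultdict, deque

WHITELIST = {"alice", "product-group"}           # chats the bot is allowed to answer
history = defaultdict(lambda: deque(maxlen=20))  # rolling context per chat

def ask_knowledge_base(context, message):
    # Placeholder: call the knowledge-base / LLM backend here.
    return f"(reply to '{message}' using {len(context)} earlier turns)"

def on_message(chat_id, message):
    if chat_id not in WHITELIST:
        return None                               # ignore chats outside the whitelist
    reply = ask_knowledge_base(list(history[chat_id]), message)
    history[chat_id].append((message, reply))     # remember the exchange for next time
    return reply

print(on_message("alice", "What does the product cost?"))
print(on_message("bob", "hi"))                    # not whitelisted -> None
```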
nickrosh/evol-teacher
https://github.com/nickrosh/evol-teacher
Open Source WizardCoder Dataset
# Evol-Teacher: Recreating WizardCoder This repo implements the code generation instruction process described in the [WizardCoder Paper](https://arxiv.org/pdf/2306.08568.pdf). Currently, WizardCoder is one of the most performant Code Generation models, being beaten only by ChatGPT. This takes the Code Alpaca 20k dataset and evolves each instruction through a randomly chosen evolution prompt to increase instruction complexity. These prompts range from increasing time/space complexity, to increasing requirements, to adding erroneous code to improve robustness, etc. This is done three times with pruning and post processing to remove unwanted instructions and responses. The iterative addition of more complexity gives higher quality and more in-depth instructions than what is usually generated in Alpaca methods. This, like in the case of WizardCoder and WizardLM, can lead to strong performance that gets very close to RLHF model performance. `generate_evol.py` allows you to generate an Evolution-Instruct dataset from any instruction dataset in the format `Instruction`/`Response`. Alpaca style datasets that contain `input` fields can be converted to Evolution format with `convert_alpaca_to_evol()`. The high level overview of the evolution process is as follows: 1. A seed instruction is taken and evolved with a randomly chosen evolution prompt using GPT3.5. 2. Responses are generated to each of these new evolved prompts also with GPT3.5. 3. Poor quality instructions and responses are pruned and also prevented from further evolution. 4. This evolution process repeats M times. In the paper and the default value in this repo, M=3. As described in the paper, I performed this process on the full 20k Code Alpaca dataset with three evolutions, resulting in a total of 80k instruction-response pairs. Over 120,000 API calls were made to OpenAI to create this dataset, and due to the rate limit, it took around three days to complete. ## Getting the full 80k Dataset The full dataset can be found on [HuggingFace Hub](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1). You can also easily recreate the full dataset by running `merge_evolutions(output_dir="./data/EvolInstruct-Code-80k/")` within `generate_evol.py`. This will merge the seed dataset and the three evolutions. The evolutions are saved here separately for later analysis on generational differences. ## Fine Tuning We can instruct-tune a model using this dataset very similarly to Alpaca tuning. Simply run `train.py` with your desired parameters. If you set the model max length to 512, it will have a much smaller memory footprint and you will be able to train faster. I instruct-tuned [ReplitLM](https://github.com/replit/ReplitLM) on the full 80k dataset using the following parameters: ```bash --model_name_or_path replit/replit-code-v1-3b \ --data_path ./data/EvolInstruct-Code-80k/EvolInstruct-Code-80k.json \ --output_dir ./checkpoints \ --num_train_epochs 3 \ --per_device_train_batch_size 1 \ --gradient_accumulation_steps 8 \ --evaluation_strategy "no" \ --save_strategy "steps" \ --save_steps 50 \ --save_total_limit 2 \ --learning_rate 2e-5 \ --weight_decay 0.
\ --warmup_ratio 0.03 \ --lr_scheduler_type "cosine" \ --logging_steps 1 \ --model_max_length 2000 \ --bf16 True \ --tf32 True ``` The instruction tuned version of the model can be found on [HuggingFace Hub](https://huggingface.co/nickrosh/Evol-Replit-v1) ## Evaluation When this model is run with Greedy Decoding, it achieves a HumanEval Pass@1 Score of 31.1%, up from the initial value of 21% for ReplitLM. Note that this is not the same foundational model as the original WizardCoder, which was fine tuned on top of StarCoder, which has a score of 34% out of the box. To run HumanEval on this model, follow the package download instructions on the HumanEval repo and then run the following script `sh human_eval.sh`. ```bash #!/bin/sh model="./checkpoints" # Path to your model temp=0.0 max_len=2048 pred_num=1 num_seqs_per_iter=1 output_path=preds/T${temp}_N${pred_num} mkdir -p ${output_path} echo 'Output path: '$output_path echo 'Model to eval: '$model # Default Sampling: temp=0.2, pred_num=200, num_seqs_per_iter=2 # CUDA_VISIBLE_DEVICES=0 python humaneval_gen.py --model ${model} \ # --temperature ${temp} --num_seqs_per_iter ${num_seqs_per_iter} --N ${pred_num} \ # --max_len ${max_len} --output_path ${output_path} # Greedy Decoding: Also set temp=0.0, pred_num=1, and num_seqs_per_iter=1 CUDA_VISIBLE_DEVICES=0 python humaneval_gen.py --model ${model} \ --temperature ${temp} --num_seqs_per_iter ${num_seqs_per_iter} --N ${pred_num} \ --max_len ${max_len} --output_path ${output_path} --greedy_decode output_path=preds/T${temp}_N${pred_num} echo 'Output path: '$output_path python process_humaneval.py --path ${output_path} --out_path ${output_path}.jsonl --add_prompt evaluate_functional_correctness ${output_path}.jsonl ``` Please refer to the WizardCoder repo for an example of how to run HumanEval on multiple GPUs. ## Citation Thanks to the original WizardCoder team ``` @misc{luo2023wizardcoder, title={WizardCoder: Empowering Code Large Language Models with Evol-Instruct}, author={Ziyang Luo and Can Xu and Pu Zhao and Qingfeng Sun and Xiubo Geng and Wenxiang Hu and Chongyang Tao and Jing Ma and Qingwei Lin and Daxin Jiang}, year={2023}, eprint={2306.08568}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` And thanks to the original Alpaca team. ``` @misc{alpaca, author = {Rohan Taori and Ishaan Gulrajani and Tianyi Zhang and Yann Dubois and Xuechen Li and Carlos Guestrin and Percy Liang and Tatsunori B. Hashimoto }, title = {Stanford Alpaca: An Instruction-following LLaMA model}, year = {2023}, publisher = {GitHub}, journal = {GitHub repository}, howpublished = {\url{https://github.com/tatsu-lab/stanford_alpaca}}, } ``` And thanks to sahil280114 for the CodeAlpaca project and seed dataset. ``` @misc{codealpaca, author = {Sahil Chaudhary}, title = {Code Alpaca: An Instruction-following LLaMA model for code generation}, year = {2023}, publisher = {GitHub}, journal = {GitHub repository}, howpublished = {\url{https://github.com/sahil280114/codealpaca}}, } ``` Naturally, you should also cite the original LLaMA paper [1] and the Self-Instruct paper [2]. Also thanks to [Teknium1](https://github.com/teknium1/stanford_alpaca-replit) for the Replit Training Script. I made some changes for the Evolution Instruct format input.
juri/Parse3339
https://github.com/juri/Parse3339
A fast pure Swift RFC 3339 timestamp parser
[![Build](https://github.com/juri/Parse3339/actions/workflows/build.yml/badge.svg)](https://github.com/juri/Parse3339/actions/workflows/build.yml) [![](https://img.shields.io/endpoint?url=https%3A%2F%2Fswiftpackageindex.com%2Fapi%2Fpackages%2Fjuri%2FParse3339%2Fbadge%3Ftype%3Dswift-versions)](https://swiftpackageindex.com/juri/Parse3339) [![](https://img.shields.io/endpoint?url=https%3A%2F%2Fswiftpackageindex.com%2Fapi%2Fpackages%2Fjuri%2FParse3339%2Fbadge%3Ftype%3Dplatforms)](https://swiftpackageindex.com/juri/Parse3339) # Parse3339 Parse3339 is a fast [RFC 3339] time stamp parser written in pure Swift. RFC 3339 specifies the commonly used subset of ISO 8601 suitable for time stamps. This parser restricts the subset even further. The following are the formats supported by Parse3339: - `2023-07-09T13:14:00+03:00` - `2023-07-09T13:14:00.2+03:00` - `2023-07-09T13:14:00Z` - `2023-07-09T13:14:00.2Z` There's nothing to configure, and it's all in just one file in case you want to copy it over instead of using it as a package. [RFC 3339]: https://www.rfc-editor.org/rfc/rfc3339 ## Usage ```swift import Parse3339 let s = "2023-07-09T13:14:00+03:00" guard let parts = Parse3339.parse(s) else { return } let date = parts.date print(date.timeIntervalSinceReferenceDate) // output: 710590440.0 ``` There's a helper function you can use with Foundation's `JSONDecoder`: ```swift import Parse3339 let decoder = JSONDecoder() decoder.dateDecodingStrategy = .custom(Parse3339.parseFromDecoder(_:)) ``` For `Package.swift` snippets and documentation, visit the [Swift Package Index page](https://swiftpackageindex.com/juri/Parse3339). ## Speed and memory usage Parse3339 is pleasantly fast and stingy with memory usage. The package has benchmarks: ```sh env PARSE3339_BENCHMARK=1 swift package benchmark --target ParserBenchmarks ``` It has benchmarks that parse the same string using Foundation's `DateFormatter`, Foundation's `ISO8601DateFormatter`, Parse3339 creating a `Date` with Foundation's `DateComponents` and `Calendar`, and Parse3339 creating a `Date` with standard Unix functions.
Output from one run: ``` Parse with DateFormatter ╒════════════════════════════╤═════════╤═════════╤═════════╤═════════╤═════════╤═════════╤═════════╤═════════╕ │ Metric │ p0 │ p25 │ p50 │ p75 │ p90 │ p99 │ p100 │ Samples │ ╞════════════════════════════╪═════════╪═════════╪═════════╪═════════╪═════════╪═════════╪═════════╪═════════╡ │ Malloc (total) │ 275 │ 275 │ 275 │ 275 │ 275 │ 275 │ 279 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Memory (resident peak) (M) │ 11 │ 15 │ 19 │ 23 │ 25 │ 27 │ 27 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Throughput (# / s) (K) │ 17 │ 16 │ 16 │ 16 │ 15 │ 12 │ 1 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Time (total CPU) (ns) │ 59084 │ 59967 │ 60255 │ 61887 │ 64927 │ 82175 │ 236750 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Time (wall clock) (ns) │ 58500 │ 59391 │ 59647 │ 61279 │ 64255 │ 83327 │ 513167 │ 100000 │ ╘════════════════════════════╧═════════╧═════════╧═════════╧═════════╧═════════╧═════════╧═════════╧═════════╛ Parse with ISO8601DateFormatter ╒════════════════════════════╤═════════╤═════════╤═════════╤═════════╤═════════╤═════════╤═════════╤═════════╕ │ Metric │ p0 │ p25 │ p50 │ p75 │ p90 │ p99 │ p100 │ Samples │ ╞════════════════════════════╪═════════╪═════════╪═════════╪═════════╪═════════╪═════════╪═════════╪═════════╡ │ Malloc (total) │ 496 │ 496 │ 496 │ 496 │ 496 │ 496 │ 497 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Memory (resident peak) (K) │ 9764 │ 9781 │ 9781 │ 9781 │ 9781 │ 9781 │ 9781 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Throughput (# / s) (K) │ 11 │ 10 │ 10 │ 10 │ 10 │ 7 │ 0 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Time (total CPU) (ns) │ 91375 │ 92351 │ 92799 │ 95551 │ 99519 │ 123007 │ 542458 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Time (wall clock) (ns) │ 90792 │ 91711 │ 92159 │ 94911 │ 98879 │ 127487 │ 3553542 │ 100000 │ ╘════════════════════════════╧═════════╧═════════╧═════════╧═════════╧═════════╧═════════╧═════════╧═════════╛ Parse with Parse3339 (DateComponents) ╒════════════════════════════╤═════════╤═════════╤═════════╤═════════╤═════════╤═════════╤═════════╤═════════╕ │ Metric │ p0 │ p25 │ p50 │ p75 │ p90 │ p99 │ p100 │ Samples │ ╞════════════════════════════╪═════════╪═════════╪═════════╪═════════╪═════════╪═════════╪═════════╪═════════╡ │ Malloc (total) │ 64 │ 64 │ 64 │ 64 │ 64 │ 64 │ 67 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Memory (resident peak) (M) │ 9 │ 41 │ 73 │ 105 │ 124 │ 136 │ 137 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Throughput (# / s) (K) │ 43 │ 42 │ 42 │ 41 │ 38 │ 29 │ 4 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Time (total CPU) (ns) │ 23416 │ 23999 │ 24223 │ 25007 │ 26463 │ 34751 │ 206791 │ 100000 │ 
├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Time (wall clock) (ns) │ 22833 │ 23423 │ 23631 │ 24383 │ 25807 │ 34335 │ 211458 │ 100000 │ ╘════════════════════════════╧═════════╧═════════╧═════════╧═════════╧═════════╧═════════╧═════════╧═════════╛ Parse with Parse3339 (Unix time) ╒════════════════════════════╤═════════╤═════════╤═════════╤═════════╤═════════╤═════════╤═════════╤═════════╕ │ Metric │ p0 │ p25 │ p50 │ p75 │ p90 │ p99 │ p100 │ Samples │ ╞════════════════════════════╪═════════╪═════════╪═════════╪═════════╪═════════╪═════════╪═════════╪═════════╡ │ Malloc (total) │ 0 │ 0 │ 0 │ 0 │ 0 │ 0 │ 0 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Memory (resident peak) (K) │ 7831 │ 7852 │ 7864 │ 7864 │ 7864 │ 7864 │ 7864 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Throughput (# / s) (K) │ 263 │ 252 │ 247 │ 242 │ 233 │ 183 │ 17 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Time (total CPU) (ns) │ 4333 │ 4503 │ 4543 │ 4667 │ 4875 │ 6543 │ 48584 │ 100000 │ ├────────────────────────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┼─────────┤ │ Time (wall clock) (ns) │ 3791 │ 3959 │ 4041 │ 4127 │ 4291 │ 5459 │ 56625 │ 100000 │ ╘════════════════════════════╧═════════╧═════════╧═════════╧═════════╧═════════╧═════════╧═════════╧═════════╛ ```
jinhoyoo/summarize_youtube_video_lecture
https://github.com/jinhoyoo/summarize_youtube_video_lecture
Summarize a YouTube video's script by the chapters the creator configured
# Youtube Script Summarizer

## Overview

Youtube Script Summarizer is a tool that summarizes the script of a YouTube video chapter by chapter, following the chapters the creator has defined. It leverages the youtube-transcript-api, langchain, and OpenAI packages to achieve this.

## How it works

The youtube-transcript-api package is used to extract the script from a YouTube video. The script is then processed using the langchain package to identify the language and translate it if necessary. Finally, the OpenAI package is used to generate a summary of the script for each chapter.

## Usage

0. Clone this project and set up Python 3.11.x or later.

1. Install dependencies

``` bash
$ pip install -r requirements.txt
```

2. Open `summary.ipynb`
 - Create a `.env` file and set the `OPENAI_API_KEY` variable.
 - Edit the variables in the "Input variables" cell according to your requirements.
 - Provide the YouTube video ID (`youtube_video_id`) for which you want to generate a summary.
 - Specify the language of the transcript (`language`).
 - Copy and paste the time stamps and chapter names from the YouTube video description into the `chapter_part_in_description` variable.

``` Python
# Youtube video ID
youtube_video_id="MZQ6bc6mPAE"

# Language of the transcript
language = "ko"

# Officially there is no way to get chapters automatically,
# so copy and paste the time stamps and chapters from the description of the Youtube video.
chapter_part_in_description = """
00:00 시작
05:24 해먹을 결심: 탄핵해야 하는 이유
10:55 댓글 읽어보기
24:27 인사조직론이란 무엇인가?
33:52 (게르만 모형) 왜 직무가 중요한가?
43:11 추미애의 직무인식
49:16 직무의 존재목적, 칸트의 인간관과 경영학적 응용
1:04:40 성과책임의 사회적 의미 (참고)직무의 3대 구성요소
1:06:05 직무개념의 부재
1:07:44 주진우와 양향자
1:11:52 추미애의 고백과 진심
1:35:09 왜 역량인가?
1:35:47 역량의 개념에 대한 이해
1:38:05 《성취예측모형》 프레임워크와 역량사전
1:41:48 진실한 리더십과 인재평가의 프레임워크
1:43:59 DANO 경영플랫폼 운용_리더십이란 무엇인가?
1:49:42 추미애에 대한 오해의 프레임과 추미애의 비전은 무엇인가?
1:53:21 이재명과 추미애 vs. 이낙연과 김진표
1:55:52 푸른 하늘을(김수영 시인, 1960.06.15.)
1:58:57 정리
"""
```

3. Run the notebook and you will get `markdown_note.md`.
 - This file contains the script and summary for each chapter.

## Troubleshooting

- Please define '00:00:00' as the start in `chapter_part_in_description`.

## FAQ

- To-Do
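For reference, the first step described in "How it works" (pulling the transcript and slicing it by chapter) can be sketched as below. This is only an illustrative snippet, not code taken from `summary.ipynb`, and the chapter list is shortened:

```python
from youtube_transcript_api import YouTubeTranscriptApi

video_id = "MZQ6bc6mPAE"
language = "ko"

# Each entry is a dict with "text", "start" (seconds), and "duration" keys.
transcript = YouTubeTranscriptApi.get_transcript(video_id, languages=[language])

# Chapter start times in seconds, parsed from the description time stamps (first three only).
chapters = [("시작", 0), ("해먹을 결심: 탄핵해야 하는 이유", 5 * 60 + 24), ("댓글 읽어보기", 10 * 60 + 55)]

def chapter_text(start: float, end: float) -> str:
    """Join all transcript lines whose start time falls inside [start, end)."""
    return " ".join(entry["text"] for entry in transcript if start <= entry["start"] < end)

bounds = [start for _, start in chapters] + [float("inf")]
for (title, start), end in zip(chapters, bounds[1:]):
    print(title, "->", len(chapter_text(start, end)), "characters of script")
```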
Moexin/IPTV
https://github.com/Moexin/IPTV
MyIPTV
Auto Update IPTV in 2023-08-02 08:42:04 CST
renegadevi/nuxt-boilerplate
https://github.com/renegadevi/nuxt-boilerplate
A ready to use Nuxt 3 boilerplate. (w/ Tailwind, i18n, Pinia, GDPR, Dark mode, TypeScript etc.)
# Nuxt Boilerplate A ready to use Nuxt 3 boilerplate **Demo**: https://nuxt-boilerplate.onrender.com/ ![screenshot](https://github.com/renegadevi/nuxt-boilerplate/blob/e806bae6efec29a19579d7cf76ef683ffe980917/.github/screenshot.png) ![lighthouse](https://github.com/renegadevi/nuxt-boilerplate/blob/e806bae6efec29a19579d7cf76ef683ffe980917/.github/lighthouse.png) ## Setup ### Prerequisites: - [pnpm](https://pnpm.io/) - [mkcert](https://github.com/FiloSottile/mkcert) (Optional localhost HTTPS) ### Install dependencies: ```bash pnpm install ``` ### Optional: `.env` ```ini VITE_BASE_URL="https://localhost:3000/" ``` ### Optional: Generate certificate for HTTPS for localhost ```bash # mkdir certs cd certs mkcert localhost ``` ## Start local server ### Start development server ```bash # HTTP pnpm run dev # HTTPS pnpm run dev-https ``` ### Start production build server (HTTP) ```bash # HTTP pnpm run build;pnpm run preview # HTTPS pnpm run build;pnpm run preview-https ``` Check out the [deployment documentation](https://nuxt.com/docs/getting-started/deployment) for more information. ## Nuxt Modules included: - [eslint](https://nuxt.com/modules/eslint) - [pinia](https://nuxt.com/modules/pinia) - [tailwindcss](https://nuxt.com/modules/tailwindcss) - [i18n](https://nuxt.com/modules/i18n) - [device](https://nuxt.com/modules/device) - [devtools](https://nuxt.com/modules/devtools) - [image](https://nuxt.com/modules/image) - [google-fonts](https://nuxt.com/modules/google-fonts) - [color-mode](https://nuxt.com/modules/color-mode) - [icon](https://nuxt.com/modules/icon) - [cookie-control](https://nuxt.com/modules/cookie-control) ## Optional: Component Viewer If you are looking for a component viewer while building and designing components, I made this extension for nuxt. Drop in the `_components.vue` into the `Pages` directory, make sure you have a `blank.vue` in Layouts that's just a `<Slot />`, and go to `/_components` https://github.com/renegadevi/nuxt-component-viewer ![nuxt-component-viewer](https://raw.githubusercontent.com/renegadevi/nuxt-component-viewer/6ca97e434ccd52d0949712ca39420529d37b3b5b/.github/screenshot-1.png)
NowScott/IndWebIndex
https://github.com/NowScott/IndWebIndex
Individual Web Index.
# Individual Web Index [![data update][data-update-image]][data-update-url] [![GitHub stars][stars-image]][stars-url] [![GitHub forks][forks-image]][forks-url] [![license GPL-3.0][license-image]][license-url] [![commit-count-permonth][commit-image]][commit-url] [![contributors-count][contributors-image]][contributors-url] [![web][web-image]][web-url] [![web-en][web-en-image]][web-en-url] ## Star History [![Star History Chart][stars-history-image]][stars-url] [data-update-url]:https://github.com/NowScott/web_database/actions/workflows/scraping.yml "数据更新" [data-update-image]:https://img.shields.io/github/actions/workflow/status/NowScott/IndWebIndex/scraping.yml?label=data%20update [stars-url]:https://github.com/NowScott/IndWebIndex/stargazers "星标" [stars-image]: https://img.shields.io/github/stars/NowScott/IndWebIndex?label=Star [forks-url]: https://github.com/NowScott/IndWebIndex/forks "复刻" [forks-image]: https://img.shields.io/github/forks/NowScott/IndWebIndex?label=Fork [license-url]: https://opensource.org/license/gpl-3-0/ "许可证" [license-image]: https://img.shields.io/github/license/NowScott/IndWebIndex [commit-url]:https://github.com/NowScott/IndWebIndex/commits/main "提交" [commit-image]:https://img.shields.io/github/commit-activity/m/NowScott/IndWebIndex [contributors-url]:https://github.com/NowScott/IndWebIndex/graphs/contributors "贡献者" [contributors-image]:https://img.shields.io/github/contributors/NowScott/IndWebIndex [web-url]:https://www.nowscott.top "中文网页" [web-image]:https://img.shields.io/badge/%E7%BD%91%E9%A1%B5%E9%A2%84%E8%A7%88-%E4%B8%AD%E6%96%87-blue [web-en-url]:https://www.nowscott.top/en "English Web" [web-en-image]:https://img.shields.io/badge/Preview-EN-blue [stars-history-image]:https://api.star-history.com/svg?repos=NowScott/IndWebIndex&type=Date ## 1.Purpose of creation Long ago, there was a website in the early days of the internet called Yellow Pages, which contained an index of many other websites. Over time, it gradually evolved into search engines. However, in search engines, the results often lack accuracy when it comes to finding specific websites. Therefore, I still believe that everyone needs their own Yellow Pages. This personal Yellow Pages would include a collection of frequently used websites, along with the ability to add tags and brief descriptions for simple searching and browsing functionality. In the end, I decided to name this personal Yellow Pages as "Individual Web Index." ## 2.Website iteration From the time I had this idea until now, this website has gone through many iterations, and the URL has also changed several times. The current and final website is [nowscott.top](https://nowscott.top)。 In the beginning, the data file was manually entered directly, and every time I wanted to update the data content, I had to access the developer interface, make changes to the data file, and then upload it to GitHub. This made me feel very unfriendly. Therefore, in the latest version, I utilized the Notion API to store the data in Notion's database. I will explain the specific implementation details in the next section. ## 3.How to use the Notion API (The code for fetching data from Notion is not included in this project.) 
This idea originated from a video by an uploader on Bilibili, and the link is provided below: [【S1E3】用Notion当数据库写一个简单的API](https://www.bilibili.com/video/BV1gF411E7pV/?share_source=copy_web&vd_source=98c7014c35363c157a4fba4929dbda77) In this video, I learned how to use the Notion API to fetch data from Notion and display it on a web page. If you have any questions, you can watch the video for more information. To summarize, I utilized a website called Netlify to deploy a service. This service runs every time someone visits the domain I deployed, retrieves data from Notion, and returns it to the frontend. Once the frontend receives the data, it can be displayed on the webpage. This sounds great, right? However, this simple process takes around 7 to 8 seconds at most, which does not align with the vision of a fast-loading personal website. Therefore, I made further improvements in this area. ## 4.Some optimizations I cannot change the access speed of the Notion API, but what I can change is the way data is fetched. If we want to make modifications more convenient, we can store a copy of the data in a convenient location. When accessing the website, we can fetch this stored data, effectively resolving the issue of slow access speed. When the data is modified, the aforementioned convenient data copy will not be updated immediately because it cannot communicate directly with Notion. To address this, I leverage GitHub Actions to regularly update the stored data. This ensures that the data remains up-to-date. ## 5.Final Notes In summary, this project is a web-based personal website index that addresses the issue of slow access speed and ensures timely data updates. These optimizations make modifications easier and provide a more efficient and seamless user experience. Thank you for your interest in this project. If you have any questions or inquiries regarding this project, you can contact me via email at: [<i class="fa fa-envelope-o"></i> [email protected]](mailto:[email protected])
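To make sections 3 and 4 a bit more concrete, the fetch-and-cache idea can be sketched roughly as below. This is not code from this project (the actual fetching code lives outside this repo); the token, database ID, and output file name are placeholders:

```python
import json

import requests

NOTION_TOKEN = "secret_xxx"          # placeholder: your Notion integration token
DATABASE_ID = "your-database-id"     # placeholder: the database that stores the links

# Query the Notion database through the official REST API.
resp = requests.post(
    f"https://api.notion.com/v1/databases/{DATABASE_ID}/query",
    headers={
        "Authorization": f"Bearer {NOTION_TOKEN}",
        "Notion-Version": "2022-06-28",
        "Content-Type": "application/json",
    },
)
resp.raise_for_status()

# Cache the result as a static file so the web page loads instantly
# instead of waiting 7 to 8 seconds for Notion on every visit.
with open("data.json", "w", encoding="utf-8") as f:
    json.dump(resp.json(), f, ensure_ascii=False)
```

Running something like this on a schedule (for example from a GitHub Actions workflow) keeps the cached copy reasonably fresh.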
flare-ai/customGPT-template
https://github.com/flare-ai/customGPT-template
Mercury - Train your own custom GPT. Chat with any file, or website.
<h1 align="center">Mercury</h4> <h4 align="center"> <img width="800" src="https://github.com/jacobvo888/readme-assets/blob/main/merc-search.png" alt=""> </h4> <h4 align="center"> <img width="200" src="https://github.com/jacobvo888/readme-assets/blob/main/logo-down-indigo.webp" alt=""> </h4> ## Chat with any Document or Website > Train your own custom GPT - Train on specific websites that you define - Train on documents you upload - Builds on dialog with Chat History - Cites sources - [Perplexity](https://www.perplexity.ai/) style UI <h2 align="center"> <br> <img width="1000" src="https://github.com/jacobvo888/readme-assets/blob/main/merc-read-1.png" alt=""> </h2> #### Supported Files - [x] .pdf - [x] .docx - [x] .md - [x] .txt - [x] .png - [x] .jpg - [x] .html - [x] .json #### Coming Soon - [ ] .csv - [ ] .pptx - [ ] notion - [ ] next 13 app dir - [ ] vercel ai sdk ## Train #### 1. Upload: `/api/embed-file` - file is uploaded -> cleaned to plain text, and split into 1000-character documents. - OpenAI's embedding API is used to generate embeddings for each document using the "text-embedding-ada-002" model. - The embeddings are stored in a Pinecone namespace. #### 2. Scrape: `/api/embed-webpage` - Web pages are scraped using [cheerio](https://github.com/cheeriojs/cheerio), cleaned to plain text, and split into 1000-character documents. - OpenAI's embedding API is used to generate embeddings for each document using the "text-embedding-ada-002" model. - The embeddings are stored in a Pinecone namespace. <h4 align="center"> <br> <img width="1000" src="https://github.com/jacobvo888/readme-assets/blob/main/merc-dark-train.png" alt=""> </h4> ## Query #### Responding to queries: `/api/query` - A single embedding is generated from the user prompt. - The embedding is used to perform a similarity search against the vector database. - The results of the similarity search are used to construct a prompt for GPT-3. - The GTP-3 response is then streamed back to the user. <h4 align="center"> <br> <img width="1000" src="https://github.com/jacobvo888/readme-assets/blob/main/merc-2.gif" alt=""> </h4> ## Getting Started ### 1. Clone Repo and Install Deps To create a new project based on this template using [degit](https://github.com/Rich-Harris/degit): ```bash npx degit https://github.com/jacobvo888/ai-template ai-template ``` ```bash cd ai-template code . ``` - install dependencies ```bash npm i ``` ### 2. Set-up Pinecone - Visit [pinecone](https://pinecone.io/) to create a free tier account and from the dashboard. - Create a new Pinecone Index with Dimensions `1536` eg: <h4 align=""> <br> <img width="700" src="https://github.com/jacobvo888/readme-assets/blob/main/merc-pinecone.png" alt=""> </h4> - Copy your API key - Record your Enironment name ex: `us-central1-gcp` - Record your index name ex: `mercury` ### 3. Set-up OpenAi API - Visit [openai](https://platform.openai.com/account/api-keys) to create and copy your API key > You can find this in the OpenAI web portal under `API Keys` ### 4. Open the `.env.local` file and configure your environment ```bash cp .env.example .env.local ``` ```bash # OpenAI OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" # Pinecone PINECONE_API_KEY="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx" PINECONE_ENVIRONMENT="us-central1-gcp" PINECONE_INDEX_NAME="mercury" ``` ### 5. Start the app ```bash npm run dev ``` Open http://localhost:3000 in your browser to view the app. 
## Template Features - OpenAI API (for generating embeddings and GPT-3 responses) - Pinecone - Nextjs API Routes (Edge runtime) - streaming - Tailwind CSS - Fonts with `@next/font` - Icons from [Lucide](https://lucide.dev) - Dark mode with `next-themes` - Radix UI Primitives - Automatic import sorting with `@ianvs/prettier-plugin-sort-imports` <h2 align="center"> <br> <a href="https://github.com/jacobvo888/ai-template"><img width="700" src="https://github.com/jacobvo888/readme-assets/blob/main/merc-light.png" alt=""></a> </h2> ## Inspiration: > 🍴 Huge thanks to [@gannonh](https://github.com/gannonh) and [@mayooear](https://github.com/mayooear/gpt4-pdf-chatbot-langchain) for their fantastic work that helped inspire this template. - https://www.perplexity.ai/ - https://builtbyjesse.com/ - https://ui.shadcn.com/docs - https://meodai.github.io/poline/ - https://github.com/gannonh/gpt3.5-turbo-pgvector - https://github.com/vercel/examples/tree/main/solutions/ai-chatgpt ## How embeddings work: ChatGPT is a great tool for answering general questions, but it falls short when it comes to answering domain-specific questions as it often makes up answers to fill its knowledge gaps and doesn't cite sources. To solve this issue, this starter app uses embeddings coupled with vector search. This app shows how OpenAI's GPT-3 API can be used to create conversational interfaces for domain-specific knowledge. <b>Embeddings</b> are vectors of floating-point numbers that represent the "relatedness" of text strings. They are very useful for tasks like ranking search results, clustering, and classification. In text embeddings, a high cosine similarity between two embedding vectors indicates that the corresponding text strings are highly related. This app uses embeddings to generate a vector representation of a document and then uses vector search to find the most similar documents to the query. The results of the vector search are then used to construct a prompt for GPT-3, which generates a response. The response is then streamed back to the user.
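To make the "relatedness" idea above concrete, here is a small Python sketch of ranking chunks against a query with the same embedding model and cosine similarity. It is purely illustrative; the template itself is TypeScript and delegates this ranking to Pinecone's vector search:

```python
import numpy as np
import openai  # assumes openai.api_key is set in the environment

def embed(text: str) -> np.ndarray:
    """Embed a string with the same model the template uses."""
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=text)
    return np.array(resp["data"][0]["embedding"])

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

query = embed("How do I reset my password?")
chunks = [
    "To reset your password, open Settings and choose Security...",
    "Our refund policy lasts 30 days from the date of purchase...",
]
scores = [cosine_similarity(query, embed(chunk)) for chunk in chunks]
best = max(zip(scores, chunks))
print(best)  # the chunk most related to the query, used to build the GPT-3 prompt
```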
sd-fabric/fabric
https://github.com/sd-fabric/fabric
null
# FABRIC: Personalizing Diffusion Models with Iterative Feedback [Paper](https://arxiv.org/abs/2307.10159) | [Website](https://sd-fabric.github.io/) | [Colab](https://colab.research.google.com/drive/1rWZ4jQHMvjc-l7xYAssa_OUOaAx3XDQT?usp=sharing) | [Gradio](https://colab.research.google.com/drive/12pFi6WAKASG18uH3UcxGMVI37e1pIwAz) FABRIC (Feedback via Attention-Based Reference Image Conditioning) is a technique to incorporate iterative feedback into the generative process of diffusion models based on StableDiffusion. This is done by exploiting the self-attention mechanism in the U-Net in order to condition the diffusion process on a set of positive and negative reference images that are to be chosen based on human feedback. 🚨 FABRIC plugin for SD WebUI (alpha version): https://github.com/dvruette/sd-webui-fabric ## Setup - Option 1: Install the repository as a pip-package (does not install dependencies, check `requirements.txt` for required dependencies): ```bash pip install git+https://github.com/sd-fabric/fabric.git ``` - Option 2: Clone the repository, create virtual environment and install the required packages as follows: ```bash python3 -m venv .venv # create new virtual environment source .venv/bin/activate # activate it pip install -r requirements.txt # install requirements pip install -e . # install current repository in editable mode ``` ## Usage The `fabric/single_round.py` script can be used to run a single round of (optionally) feedback-conditioned generation as follows: ```bash # 1st round (text-to-image w/o feedback) python fabric/single_round.py prompt="photo of a dog running on grassland, masterpiece, best quality, fine details" # 2nd round (text-to-image w/ feedback) python fabric/run_single.py \ prompt="photo of a dog running on grassland, masterpiece, best quality, fine details" \ liked="[outputs/images/2023-07-06/example_1_1.png]" \ disliked="[outputs/images/2023-07-06/example_1_3.png]" ``` Alternatively, the FABRIC generators can be used to incorporate iterative feedback in the generation process as follows: ```python from PIL import Image from fabric.generator import AttentionBasedGenerator from fabric.iterative import IterativeFeedbackGenerator def get_feedback(images) -> tuple[list[Image.Image], list[Image.Image]]: raise NotImplementedError("TODO: Implement your own function to select positive and negative feedback") base_generator = AttentionBasedGenerator("dreamlike-art/dreamlike-photoreal-2.0", torch_dtype=torch.float16) base_generator.to("cuda") generator = IterativeFeedbackGenerator(base_generator) prompt = "photo of a dog running on grassland, masterpiece, best quality, fine details" negative_prompt = "lowres, bad anatomy, bad hands, cropped, worst quality" for _ in range(4): images: list[Image.Image] = generator.generate(prompt, negative_prompt=negative_prompt) liked, disliked = get_feedback(images) generator.give_feedback(liked, disliked) generator.reset() ``` ## Evaluation To replicate the evaluation results, the provided evaluation scripts can be used as follows: ```bash # Experiment 1: Preference model-based feedback selection python fabric/evaluation/preference_model_feedback.py # Experiment 2: Target image-based feedback selection python fabric/evaluation/target_image_feedback.py ``` ## Citation ``` @misc{vonrutte2023fabric, title={FABRIC: Personalizing Diffusion Models with Iterative Feedback}, author={Dimitri von Rütte and Elisabetta Fedele and Jonathan Thomm and Lukas Wolf}, year={2023}, eprint={2307.10159}, archivePrefix={arXiv}, 
primaryClass={cs.CV} } ```
verytinydever/covid-19-bot-updater
https://github.com/verytinydever/covid-19-bot-updater
null
Bot that gives covid-19 update
crowsonkb/dice-mc
https://github.com/crowsonkb/dice-mc
DiCE: The Infinitely Differentiable Monte-Carlo Estimator
# dice-mc Implements [DiCE: The Infinitely Differentiable Monte-Carlo Estimator](https://arxiv.org/abs/1802.05098) in PyTorch. DiCE is a surrogate loss for the score function estimator (REINFORCE), an unbiased Monte Carlo estimator of the gradient and higher-order derivatives (Hessian etc.) of the expectation of the loss, where the computation graph contains nondifferentiable stochastic nodes, such as sampling from a categorical distribution. An example use case of DiCE is fine-tuning a large language model with a loss that depends on outputs sampled from it during training, as is done in [RLHF](https://en.wikipedia.org/wiki/Reinforcement_learning_from_human_feedback). ## Usage ```python import dice_mc.torch as dice ``` Stochastic nodes are represented as a tensor of the logprobs of the sampled actions. `logp_categorical()` and `sample_categorical()` create and return stochastic nodes for categorical distributions. `logp_categorical()` is used when you already have the actions that were sampled, and `sample_categorical()` is used when you need to sample actions from the logits. ```python logp_1, actions_1 = dice.sample_categorical(logits_1) ``` Cost nodes are represented as a tensor of DiCE surrogate losses. `cost_node()` creates and returns a cost node given a cost tensor and an iterable of stochastic nodes that the costs were causally influenced by. (Including additional stochastic nodes will increase the variance of the estimator, but will not introduce bias.) ```python losses_1 = dice.cost_node(losses_1, [logp_1]) ``` The output of `cost_node()` can be differentiated to propagate gradients to the stochastic nodes given in the second argument. DiCE surrogate losses, when autodifferentiated, produce correct Monte Carlo estimators of higher order derivatives as well. The forward pass values of the DiCE surrogate losses are not modified, so they can be printed or used in metrics. ```python losses = losses_1 + 0.1 * losses_2 loss = losses.mean() loss.backward() opt.step() ``` Baselines can be used to reduce the variance of the estimator. DiCE baseline terms are scalars with the value of zero that have the gradient of baseline subtraction. `EMABaseline` is a simple exponential moving average baseline. `EMABaseline` contains state which should be saved and loaded when checkpointing. ```python baseline = dice.EMABaseline().to(device) ... losses = losses + baseline(losses, [logp_1, logp_2, logp_3]) # All stochastic nodes loss = losses.mean() loss.backward() opt.step() ``` If you have batches of losses, you can use `batch_baseline_term()`, a DiCE version of the [REINFORCE with replacement baseline](https://openreview.net/forum?id=r1lgTGL5DE). It uses the mean of the other losses in the batch as the baseline for each loss in the batch. ### A note on batching If you are computing samples in batches where each batch item is independent of the others, you should provide a 1D tensor of losses, one per batch item, to `cost_node()` and only afterward take the mean along the batch dimension. This will result in a lower variance estimator whose variance decreases as you increase the batch size. This is because each stochastic node is secretly a batch of stochastic nodes, one per batch item, and each cost node is secretly a batch of cost nodes, one per batch item, and the cost nodes only depend on stochastic nodes with the same batch index, so they can be excluded to reduce variance. 
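To make the batching advice concrete, here is a minimal end-to-end sketch using only the functions documented above. The toy task, shapes, and hyperparameters are illustrative assumptions, not part of the library:

```python
import torch
import dice_mc.torch as dice

policy = torch.nn.Linear(8, 4)                    # maps observations to categorical logits
opt = torch.optim.SGD(policy.parameters(), lr=1e-2)

obs = torch.randn(64, 8)                          # batch of 64 independent items
targets = torch.randint(0, 4, (64,))              # per-item "correct" action for the toy task

logits = policy(obs)
logp, actions = dice.sample_categorical(logits)   # one stochastic node per batch item

# Nondifferentiable per-item cost: 1 if the sampled action is wrong, else 0.
costs = (actions != targets).float()

# Keep the batch dimension when building the cost node, and only
# reduce to a scalar afterwards, as recommended above.
losses = dice.cost_node(costs, [logp])
loss = losses.mean()

loss.backward()   # propagates score-function gradients into the policy
opt.step()
```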
### Making a stochastic node from a generation from a language model You can sample from an autoregressive language model and then, after the fact, create a stochastic node from the logits and the sampled actions. Note: for Hugging Face models, top-k is 50 by default which will make the sampled tokens diverge from the distribution given by the logits. You should set top-k to 0. ```python tokens = model.generate(..., do_sample=True, temperature=1.0, top_k=0) ``` After sampling, you can run the tokens through the model once with gradients enabled to get logits which require grad, and create a stochastic node from the logits and tokens: ```python outputs = model(tokens, attention_mask=attention_mask) logp = dice.logp_categorical(outputs.logits[:, prompt_len - 1 : -1], tokens[:, prompt_len:]) ``` The prompt should be excluded except for the logits for the last prompt token. The tokens should be shifted one position left so that each token lines up with the vector of logits it was sampled from. The log probability under the model of each *prefix* of the tokens (subsequence that contains the first token) is given by summing the log probabilities of the tokens in the prefix, so take `logp.cumsum(dim=1)` to get the log probability of each prefix. The "stochastic node" created by `logp_categorical()`, for an autoregressive sequence model, is secretly a sequence of stochastic nodes where each node is only causally influenced by nodes to its left. Again, you can use this fact to reduce the variance of the estimator: suppose one of your loss terms is the KL divergence from the logits of a reference model. The KL penalty for a given token is not affected by tokens to its right, so they can be excluded: ```python logq = dice.logp_categorical(outputs_ref.logits[:, prompt_len - 1 : -1], tokens[:, prompt_len:]) losses_kl = logp.detach() - logq.detach() logp_cumsum = torch.cumsum(logp, dim=1) losses_kl = dice.cost_node(losses_kl, [logp_cumsum]) ``` If you are using a batch size greater than 1, this snippet will also correctly follow the batching advice above.
arthurgeek/vaultwarden-fly-template
https://github.com/arthurgeek/vaultwarden-fly-template
Template to deploy Vaultwarden on Fly.io with websockets support and sqlite hourly backups
# Template for deploying [Vaultwarden] on [Fly.io] This is a template for deploying [Vaultwarden] on [Fly.io] with [Caddy](https://caddyserver.com) for [websockets] support and [supercronic](https://github.com/aptible/supercronic) for hourly [restic](https://restic.net) backups with e-mail failure notification via [msmtp](https://marlam.de/msmtp/). This uses a single fly machine, within Fly's [free allowance](https://fly.io/docs/about/pricing/#free-allowances). ## Usage You first need to create a new repo for your config, by clicking on the **Use this template** button on this page. Then, clone your new repo and `cd` into it. ### Install dependencies 1. Install [go-task](https://github.com/go-task/task): We use go-task to automate some steps, you can check the task code under [.taskfiles](.taskfiles). to see which commands each task run. ```sh brew install go-task/tap/go-task ``` 1. Install [flyctl](https://fly.io/docs/hands-on/install-flyctl/): ```sh brew install flyctl ``` ### Configuration The `.config.env` file contains environment variables needed to deploy the apps in this template. 1. Copy the `.config.sample.env` to `.config.env` and fill out all the environment variables. **All uncommented variables are required**. ### [Fly.io] setup For some commands below, we use a task instead of `flyctl` because we the task writes (on app creation) and reads (subsequent commands) your app name from the config file. This is the only way to keep your app name hidden. 1. Signup to Fly If you already have a Fly account, use `flyctl auth login` instead. ```sh flyctl auth signup ``` 1. Create a new fly app If this is your first app, you'll be asked to add credit card information, but, don't worry, you'll not be charged by this app. ```sh task fly:app:create ``` 1. Create a new volume This will show you a warning about invididual volumes. It's ok to have a single volume because we're not concerned about downtime for our Vaultwarden instance. ```sh task fly:volume:create ``` 1. Deploy your app ```sh task fly:app:deploy ``` 1. Setup your custom domain After your app is deployed, follow the steps [here](https://fly.io/docs/app-guides/custom-domains-with-fly/) to setup your custom domain. 1. Open your new Vaultwarden website That's all! Now you can open your custom domain and Vaultwarden should work. ## Keeping dependencies up to date This template uses [Renovatebot](https://www.mend.io/free-developer-tools/renovate/) to scan and open new PRs when dependencies are out of date. To enable this, open their [Github app](https://github.com/apps/renovate) page, click the "Configure" button, then choose your repo. The template already provides Renovate configs and there's no need for further action. ## Troubleshooting If your deployment failed or you can't open Vaultwarden web, you can see the logs with: ```sh task fly:app:logs ``` If that command fails (eg, if the machine is stopped), try opening your logs in the browser: ```sh task fly:app:logs:web ``` You can also ssh in the machine with: ```sh task fly:app:ssh ``` and check individual logs using [overmind](https://github.com/DarthSim/overmind): ```sh # Run this command inside your fly machine overmind connect vaultwarden ``` This will open a tmux window with vaultwarden logs. You can scroll your tmux window with `Ctrl-B-]` and use `Ctrl-B-D` to exit the tmux window. Substitute `vaultwarden` with `caddy`, or `backup` to see logs for other apps. 
## Continuous deployment

After your first manual deploy to Fly.io, per instructions above, you can set up continuous deployment via Github Actions.

1. Install [Github CLI](https://cli.github.com)

    ```sh
    brew install gh
    ```

1. Login to Github

    ```sh
    gh auth login
    ```

1. Set Fly secrets to your Github repo

    ```sh
    task github:secrets:set
    ```

1. Test your workflow deployment

    ```sh
    task github:workflow:deploy
    ```

That's all! Now, any changes to your `Dockerfile`, `fly.toml` or `scripts`/`config` will trigger a fly deploy.

## FAQ

1. Why does every `fly` command I run error with: `Error: the config for your app is missing an app name`?

    For security reasons the app name is not saved in the [fly.toml] file. In that case, you have to add `-a your-app-name` to all `fly` commands. Your app name is found in your `.config.env` file. Example:

    ```sh
    fly secrets list -a your-app-name
    ```

    Or you can add:

    ```toml
    app = "your-app-name"
    ```

    to the beginning of your [fly.toml] file.

[Vaultwarden]: https://github.com/dani-garcia/vaultwarden
[Fly.io]: https://fly.io
[websockets]: https://github.com/dani-garcia/vaultwarden/wiki/Enabling-WebSocket-notifications
[fly.toml]: fly.toml
kendryte/k230_sdk
https://github.com/kendryte/k230_sdk
Kendryte K230 SDK
# Canaan Kendryte K230 SDK

[![Github repository](https://img.shields.io/badge/github-repository-blue?logo=github&style=plastic)](https://github.com/kendryte/k230_docs) [![Gitee repository](https://img.shields.io/badge/gitee-repository-blue?logo=gitee&style=plastic)](https://gitee.com/kendryte/k230_docs) [![GitHub release](https://img.shields.io/github/v/release/kendryte/k230_docs?color=brightgreen&display_name=tag&logo=github&style=plastic)](https://github.com/kendryte/k230_docs/releases)

[TOC]

## K230 related resources

| Link | Description |
| ------ | ---- |
| [K230 SDK](https://github.com/kendryte/k230_sdk) | Kendryte K230 SDK Github Repo |
|[K230 Docs](https://github.com/kendryte/k230_docs)| Kendryte K230 Docs Github Repo |
| [K230 SDK Gitee mirror](https://gitee.com/kendryte/k230_sdk) | Kendryte K230 SDK Gitee mirror Repo |
|[K230 Docs Gitee mirror](https://gitee.com/kendryte/k230_docs)| Kendryte K230 Docs Gitee mirror Repo |
|[K230 SDK Release mirror](https://kendryte-download.canaan-creative.com/k230/release/sdk/k230_sdk.tar.gz)| Download of the Canaan Kendryte K230 SDK release k230_sdk.tar.gz |
|[Canaan Kendryte Forum](https://developer.canaan-creative.com/)| Canaan Kendryte developer forum |

## K230 chip overview

The K230 is the latest-generation SoC in Canaan's Kendryte® family of AIoT chips. It uses a new multi-heterogeneous-unit accelerated computing architecture, integrates two energy-efficient RISC-V compute cores, and includes the new-generation KPU (Knowledge Process Unit) intelligent computing unit with multi-precision AI compute. It broadly supports common AI computing frameworks, and utilization on some typical networks exceeds 70%.

The chip also provides a rich set of peripheral interfaces, as well as dedicated hardware acceleration units for 2D, 2.5D and other scalar, vector, and graphics workloads. It can accelerate the full pipeline of diverse image, video, audio, and AI computing tasks, and offers low latency, high performance, low power consumption, fast boot, and high security.

## K230 EVB hardware board overview

The K230-USIP-LP3-EVB is an evaluation board designed around the K230-USIP AI chip with LPDDR3 memory. It is intended to cover the characteristic features supported by the K230 through a comprehensive set of interfaces, while also providing customers with a reference design for their own development.

![k230_board](https://kendryte-download.canaan-creative.com/k230/downloads/doc_images/k230_unsip_board.png)

| No. | Description |
|------------|------------------|
| 1 | CKLINK debug |
| 2 | LCD extension, 4-lane MIPI DSI|
| 3 | LPDDR3 512MB|
| 4 | K230 SoC |
| 5 | 32Mbit QSPI NOR Flash|
| 6 | Sensor adapter board (4-lane MIPI CSI)|
| 7 | 4GB eMMC|
| 8 | FT2232 USB-to-serial|
| 9 | SD card slot|
| 10 | USB to dual serial ports|
| 11 | USB0, can connect a USB-to-Ethernet adapter|
| 12 | Reset button|
| 13 | Power switch|
| 14 | Boot switch|
| 15 | USB 5V power supply|

## K230 SDK software quick start

### K230 SDK layout

| Top-level directory | Second-level directory | Description |
|--------|--------|------|
| configs| NA | Resource configuration (memory allocation plan) |
| output | NA | SDK build artifacts|
| src | big | Big-core RTSmart code|
| src | common | Code shared by the big and little cores|
| src | little | Little-core Linux code|
| tools | docker | dockerfile |
| tools | doxygen|doxygen scripts and configuration files |
| tools | kconfig| |
| tools | gen_image.sh | Script that generates a flashable image |
| tools | gen_image_cfg| Image partition configuration files|
| tools | tuning-tool-client| PC-side image tuning tool |

The K230 SDK is the software development kit for K230 development boards. It contains the source code, toolchains, and other resources needed to develop for the Linux & RT-Smart dual-core heterogeneous system.

![k230_software_arch](https://kendryte-download.canaan-creative.com/k230/downloads/doc_images/software_arch.png)

### Setting up the development environment

The K230 SDK must be built in a Linux environment; Ubuntu Linux 20.04 is recommended.

#### Using the docker build environment

- Get the docker build image

Building the K230 SDK inside docker is recommended; you can use the following docker image directly:

```shell
docker pull ghcr.io/kendryte/k230_sdk
```

> Note: the docker image does not include the toolchain by default. After downloading the source code, running `make prepare_sourcecode` will automatically download the toolchain into the build directory.

If you prefer to build on native Linux instead of docker, refer to `tools/docker/Dockerfile` and install the corresponding tools on your Linux system.

### Building the K230 SDK

#### Download the K230 SDK source code

```shell
git clone https://github.com/kendryte/k230_sdk
cd k230_sdk
source tools/get_download_url.sh && make prepare_sourcecode
```

> `make prepare_sourcecode` automatically downloads the Linux and RT-Smart toolchains, the buildroot package, the AI package, etc.
The download time and speed depend on your actual network connection.

#### Build the K230 SDK

Taking a docker-based build as an example:

- Make sure the current directory is the `k230_sdk` source root, then use the following commands to enter docker and run `make` to start the build

```shell
docker run -u root -it -v $(pwd):$(pwd) -v $(pwd)/toolchain:/opt/toolchain -w $(pwd) k230_sdk /bin/bash
make
```

> The toolchain automatically downloaded into the external directory is mapped to `/opt/toolchain/` inside the docker image.

#### Build artifacts

``` shell
k230_evb_defconfig/images
├── big-core
├── env.env
├── jffs2.env
├── little-core
├── sysimage-sdcard.img           # Non-secure boot image for SD card and eMMC
├── sysimage-sdcard.img.gz        # Compressed non-secure boot image for SD card and eMMC
├── sysimage-spinor32m.img        # Non-secure boot image for NOR flash
├── sysimage-spinor32m.img.gz     # Compressed non-secure boot image for NOR flash
└── sysimage-spinor32m_jffs2.img  # Non-secure jffs2 boot image for NOR flash
```

### Flashing the image

The K230 supports several boot modes, including SD card, eMMC, and NOR flash. You can switch between boot modes by changing the boot DIP switch on the board.

For convenient development, it is recommended to prepare a TF card and switch the DIP switch to SD card boot mode; you can later consider flashing the image into eMMC.

#### Flashing a TF card

When flashing a TF card on Linux, first determine the SD card's device name `/dev/sdx` on your system and substitute it for `/dev/sdx` in the following command:

``` shell
sudo dd if=sysimage-sdcard.img of=/dev/sdx bs=1M oflag=sync
```

When flashing on Windows, the [balena Etcher](https://etcher.balena.io/) tool is recommended.

![Etcher](https://github.com/kendryte/k510_docs/raw/v1.5/zh/images/quick_start/balenaetcher.png)

For other, more detailed flashing methods, please refer to the K230 SDK documentation.

### Powering up the K230 EVB board

Make sure the boot switch is set to SD card boot mode, insert the flashed TF card into the card slot on the board, and flip the power switch K1 to the ON position to power the system up. If the serial port is connected, you will see the boot log on the serial console.

**After power-up there are two serial devices by default, which give access to the little-core Linux and the big-core RTSmart respectively.**

The big-core RTSmart system automatically starts an application at boot; press the q key to exit to the command prompt.

## Contributing

If you are interested in this project and want to report issues or submit code, please refer to [CONTRIBUTING](.github/CONTRIBUTING.md)

## Contact us

北京嘉楠捷思信息技术有限公司

Website: [canaan-creative.com](www.canaan-creative.com)

Business inquiries: [[email protected]]([email protected])
dongyx/lnn
https://github.com/dongyx/lnn
A Command-Line Program of Feedforward Neural Networks
LNN === LNN (Little Neural Network) is a command-line program of feedforward neural networks aiming to **make easy tasks easily done**. The following Unix pipeline trains a network to sum real numbers. $ seq 1024 \ | awk '{x=rand(); y=rand(); print x,y,x+y}' \ | lnn train -Cq2i1i -i1024 >model.nn The trained model could be used as an addition calculator. $ echo 5 3 | lnn run -m model.nn 8.000000 The key features of LNN are list below. - Light weight, containing only a standalone executable; - Serve as a Unix filter; Easy to work with other programs; - Plain-text formats of models, input, output, and samples; - Compact notations; - Different activation functions for different layers; - L2 regularization; - Mini-batch training. **Table of Contents** - [Installation](#installation) - [Getting Started](#getting-started) - [Further Documentation](#further-documentation) - [Following News](#following-news) Installation ------------ It would be better to select a version from the [release page](https://github.com/dongyx/lnn/releases) than downloading the working code, unless you understand the status of the working code. The latest stable release is always recommended. $ make $ sudo make install By default, LNN is installed to `/usr/local`. You could call `lnn --version` to check the installation. Getting Started --------------- The following call of LNN creates a network with a 10-dimension input layer, a 5-dimension hidden layer with the sigmoid activation function, and a 3-dimension output layer with the softmax activation function. $ lnn train -C q10i5s3m samples.txt >model.nn The `-C` option creates a new model with the structure specified by the argument. The argument here is `q10i5s3m`. The first character `q` specifies the loss function to be the quadratic error. The following three strings `10i`, `5s`, `3m` represent that there are 3 layers, including the input layer, with dimensions 10, 5, 3, respectively. The character following each dimension specifies the activation function for that layer. Here `i` ,`s`, `m` represent the identity function, the sigmoid function, and the softmax function, respectively ([Further Documentation](#further-documentation)). In the remaining part of this chapter, the dimension of input is denoted by $n$, and the dimension of output is denoted by $m$. LNN reads samples from the file operand, or, by default, the standard input. The trained model is printed to the standard output in a text format. The sample file is a text file containing numbers separated by white characters (space, tab, newline). Each $n+m$ numbers constitute a sample. The first $n$ numbers of a sample constitute the input vector, and the remaining constitute the output vector. LNN supports many training arguments like learning rate, iteration count, and batch size ([Further Documentation](#further-documentation)). LNN could train a network based on an existed model by replacing `-C` with `-m`. $ lnn train -m model.nn samples.txt >model2.nn This allows one to observe the behaviors of the model in different stages and provide different training arguments. The `run` sub-command runs an existed model. $ lnn run -m model.nn input.txt LNN reads the input vectors from the file operand, or, by default, the standard input. The input shall contain numbers separated by white characters (space, tab, newline). Each $n$ numbers constitute an input vector. The output vector of each input vector is printed to the standard output. Each line contains an output vector. 
Components of an output vector are separated by a space. The `test` sub-command evaluates an existed model. $ lnn test -m model.nn samples.txt LNN reads samples from the file operand, or, by default, the standard input. The mean loss value of the samples is printed to the standard output. The format of the input file is the same as of the `train` sub-command. Further Documentation --------------------- - The [technical report](https://www.dyx.name/notes/lnn.html) serves as an extension of this read-me file. It contains more details and examples for understanding the design and usage. - Calling `lnn --help` prints a brief of the command-line options. Following News -------------- - [Author's homepage](https://www.dyx.name) - [Twitter](https://twitter.com/dongyx2)
saagarjha/Presentations
https://github.com/saagarjha/Presentations
Slides and resources for talks I've given
# Presentations Slides and resources for talks I've given
OpenGVLab/InternLMM
https://github.com/OpenGVLab/InternLMM
null
# InternLMM ![](multimodal_1.png) ![](multimodal_2.png)
hako-mikan/sd-webui-cd-tuner
https://github.com/hako-mikan/sd-webui-cd-tuner
Color/Detail control for Stable Diffusion web-ui
# CD(Color/Detail) Tuner Color/Detail control for Stable Diffusion web-ui/色調や書き込み量を調節するweb-ui拡張です。 [日本語](#使い方) Update 2023.07.13.0030(JST) - add brightness - color adjusting method is changed - add disable checkbox ![](https://raw.githubusercontent.com/hako-mikan/sd-webui-cd-tuner/imgs/sample0.png) This is an extension to modify the amount of detailing and color tone in the output image. It intervenes in the generation process, not on the image after it's generated. It works on a mechanism different from LoRA and is compatible with 1.X and 2.X series. In particular, it can significantly improve the quality of generated products during Hires.fix. ## Usage It automatically activates when any value is set to non-zero. Please be careful as inevitably the amount of noise increases as the amount of detailing increases. During the use of Hires.fix, the output might look different, so it is recommended to try with expected settings. Values around 5 should be good, but it also depends on the model. If a positive value is input, the detailing will increase. ### Detail1,2 Drawing/Noise Amount When set to negative, it becomes flat and slightly blurry. When set to positive, the detailing increases and becomes noisy. Even if it is noisy in normal generation, it might become clean with hires.fix, so be careful. Detail1 and 2 both have similar effects, but Detail1 seems to have a stronger effect on the composition. In the case of 2.X series, the reaction of Detail 1 may be the opposite of normal, with more drawings in negative. ![](https://raw.githubusercontent.com/hako-mikan/sd-webui-cd-tuner/imgs/sample8.png) ### Contrast: Contrast/Drawing Amount, Brightness Contrast and brightness change, and at the same time the amount of detailing also changes. It would be quicker to see the sample. The difference between Contrast 1 and Contrast 2 lies in whether the adjustment is made during the generation process or after the generation is complete. Making the adjustment during the generation process results in a more natural outcome, but it may also alter the composition. ![](https://raw.githubusercontent.com/hako-mikan/sd-webui-cd-tuner/imgs/sample7.png) ### Color1,2,3 Color Tone You can tune the color tone. For `Cyan-Red`, it becomes `Cyan` when set to negative and `Red` when set to positive. ![](https://raw.githubusercontent.com/hako-mikan/sd-webui-cd-tuner/imgs/sample6.png) ### Hr-Detail1,2 ,Hires-Scaling In the case of using Hires-fix, the optimal settings often differ from the usual. Basically, when using Hires-Fix, it is better to input larger values than when not using it. Hr-Detail1,2 is used when you want to set a different value from when not used during Hires-Fix generation. Hires-Scaling is a feature that automatically sets the value at the time of Hires-Fix. The value of Hires-scale squared is usually multiplied by the original value. ## Use in XYZ plot/API You can specify the value in prompt by entering in the following format. Please use this if you want to use it in XYZ plot. ``` <cdt:d1=2;col1=-3> <cdt:d2=2;hrs=1> <cdt:1> <cdt:0;0;0;-2.3;0,2> <cdt:0;0;0;-2.3;0;2;0;0;1> ``` The available identifiers are `d1,d2,con1,con2,bri,col1,col2,col3,hd1,hd2,hrs,st1,st2`. When describing in the format of `0,0,0...`, please write in this order. It is okay to fill in up to the necessary places. The delimiter is a semicolon (;). If you write `1,0,4`, `d1,d2,cont` will be set automatically and the rest will be `0`. `hrs` turns on when a number other than `0` is entered. 
This value will be prioritized if a value other than `0` is set. At this time, `Skipping unknown extra network: cdt` will be displayed, but this is normal operation. ### Stop Step You can specify the number of steps to stop the adjustment. In Hires-Fix, the effects are often not noticeable after the initial few steps. This is because in most samplers, a rough image is already formed within the first 10 steps. ## Examples of use The left is before use, the right is after use. Click the image to enlarge it. Here, we are increasing the amount of drawing and making it blue. The difference is clearer when enlarged. You can expect an improvement in reality with real-series models. ![](https://raw.githubusercontent.com/hako-mikan/sd-webui-cd-tuner/imgs/sample4.png) ![](https://raw.githubusercontent.com/hako-mikan/sd-webui-cd-tuner/imgs/sample5.png) # Color/Detail control for Stable Diffusion web-ui 出力画像の描き込み量や色調を変更する拡張機能です。生成後の画像に対してではなく生成過程に介入します。LoRAとは異なる仕組みで動いています。2.X系統にも対応しています。特にHires.fix時の生成品質を大幅に向上させることができます。 ## 使い方 どれかの値が0以外に設定されている場合、自動的に有効化します。描き込み量が増えると必然的にノイズも増えることになるので気を付けてください。Hires.fix使用時では出力が違って見える場合があるので想定される設定で試すことをおすすめします。数値は大体5までの値を入れるとちょうど良いはずですがそこはモデルにも依存します。正の値を入力すると描き込みが増えたりします。 ### Detail1,2 描き込み量/ノイズ マイナスにするとフラットに、そして少しぼけた感じに。プラスにすると描き込みが増えノイジーになります。通常の生成でノイジーでもhires.fixできれいになることがあるので注意してください。Detail1,2共に同様の効果がありますが、Detail1は2に比べて構図への影響が強く出るようです。2.X系統の場合、Detail 1の反応が通常とは逆になり、マイナスで書き込みが増える場合があるようです。 ![](https://raw.githubusercontent.com/hako-mikan/sd-webui-cd-tuner/imgs/sample8.png) ### Contrast : コントラスト/描き込み量 コントラストや明るさがかわり、同時に描き込み量も変わります。サンプルを見てもらった方が早いですね。 ![](https://raw.githubusercontent.com/hako-mikan/sd-webui-cd-tuner/imgs/sample7.png) ### Color1,2,3 色調 色調を補正できます。`Cyan-Red`ならマイナスにすると`Cyan`、プラスにすると`Red`になります。 ![](https://raw.githubusercontent.com/hako-mikan/sd-webui-cd-tuner/imgs/sample6.png) ### Hr-Detail1,2 ,Hires-Scaling Hires-fixを使用する場合、最適な設定値が通常とは異なる場合が多いです。基本的にはHires-Fix使用時には未使用時より大きめの値を入れた方が良い結果が得られます。Hr-Detail1,2ではHires-Fix生成時に未使用時とは異なる値を設定したい場合に使用します。Hires-Scalingは自動的にHires-Fix使用時の値を設定する機能です。おおむねHires-scaleの2乗の値が元の値に掛けられます。 ## XYZ plot・APIでの利用について promptに以下の書式で入力することでpromptで値を指定できます。XYZ plotで利用したい場合にはこちらを利用して下さい。 ``` <cdt:d1=2;col1=-3> <cdt:d2=2;hrs=1> <cdt:1> <cdt:0;0;0;-2.3;0,2> <cdt:0;0;0;-2.3;0;2;0;0;1> ``` 使用できる識別子は`d1,d2,con1,con2,bri,col1,col2,col3,hd1,hd2,hrs,st1,st2`です。`0,0,0...`の形式で記述する場合にはこの順に書いてください。区切りはセミコロン「;」です。記入は必要なところまでで大丈夫です。`1,0,4`なら自動的に`cont`までが設定され残りは`0`になります。`hrs`は`0`以外の数値が入力されるとオンになります。 `0`以外の値が設定されている場合にはこちらの値が優先されます。 このとき`Skipping unknown extra network: cdt`と表示されますが正常な動作です。 ### stop step 補正を停止するステップ数を指定できます。Hires-Fixでは最初の数ステップ以降は効果が感じられないことが多いです。大概のサンプラーで10ステップ絵までには大まかな絵ができあがっているからです。 ## 使用例 リアル系モデルでリアリティの向上が見込めます。 ![](https://raw.githubusercontent.com/hako-mikan/sd-webui-cd-tuner/imgs/sample4.png) ![](https://raw.githubusercontent.com/hako-mikan/sd-webui-cd-tuner/imgs/sample5.png)
okatu-loli/Baoding-Electricity-Tracker
https://github.com/okatu-loli/Baoding-Electricity-Tracker
null
# Baoding-Electricity-Tracker

Baoding-Electricity-Tracker is an automation program written in Python that helps users check the daily electricity bill for Baoding, Hebei Province, and can send bill notifications through platforms such as Feishu and ServerChan.

## Features

- **Automated queries**: Logs into the web page on a schedule and fetches the electricity bill data without manual intervention.
- **Data API**: Exposes an API through a Flask web server so the current bill can be queried at any time.
- **Threshold notifications**: Automatically sends a notification when the electricity balance drops below a user-defined threshold.
- **Run notifications**: Can be configured to send a notification after every query run, regardless of whether the threshold is reached.
- **Multi-platform support**: Currently supports Feishu and ServerChan notifications; more platforms may be supported in the future.

## Usage

1. Edit the configuration file `config.ini` and fill in your account credentials, notification settings, and your [Feishu](https://open.feishu.cn/document/client-docs/bot-v3/add-custom-bot) and [ServerChan](https://sct.ftqq.com/sendkey) keys.
2. Run `app.py` to start the web server.
3. Visit the `http://localhost:5000/electricity` endpoint to get the latest bill data.
4. You can also run `main.py` on its own, which works without the scheduled task and API features.
5. Set the notification times, thresholds, and so on, and the program will work according to that configuration.

## Dependencies

Python 3.10

Install the dependencies:

```bash
pip install -r requirements.txt
```

## Notes

Because the `ddddocr` project has not been updated for a long time and does not pin a version of the `Pillow` library, this project has compatibility issues with the latest `Pillow`. Therefore, **it is strongly recommended to run this project with Python 3.10** to avoid potential compatibility problems.

This issue has already been fixed by a third-party developer, and we are waiting for the original author to merge the PR.

## Changelog

2023.07.14: Added MQTT support for multi-channel message pushing, see PR [#1](https://github.com/okatu-loli/Baoding-Electricity-Tracker/pull/1). Thanks to [@sunshinenny](https://github.com/sunshinenny)

Related links:

- [ddddocr requirements](https://github.com/sml2h3/ddddocr/blob/master/ddddocr/requirements.txt)
- [Pillow deprecations](https://pillow.readthedocs.io/en/stable/deprecations.html#constants)
- [ddddocr Pull Request #126](https://github.com/sml2h3/ddddocr/pull/126)

## License

This project is released under the MIT license; feel free to use, distribute, and modify it.

## Contact

If you have questions or suggestions about using it, feel free to contact me.
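For example, once `app.py` is running, the endpoint from step 3 of the usage section can be queried like this (a minimal sketch; the exact response fields depend on the program, so none are assumed here):

```python
import requests

# Ask the local Flask server for the latest electricity data.
resp = requests.get("http://localhost:5000/electricity", timeout=10)
resp.raise_for_status()
print(resp.json())
```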
zoogie/Kartminer7
https://github.com/zoogie/Kartminer7
Use Mario Kart 7 as a blue shell to take control of your 3DS
# Kartminer BETA ## Thanks - wwylele for [save3ds](https://github.com/wwylele/save3ds) Installing the exploit to extdata from just seedminer and a PC would be impossible without this tool. - PabloMK7 for [3dsropxploitinjector](https://github.com/PabloMK7/3dsropxploitinjector) which makes tricky exploits like this much less of a hassle. Do you know he is also the author of a different mk7 exploit [kartdlphax](https://github.com/PabloMK7/kartdlphax), and ALSO the famous mk7 plugin [CTGP-7](https://ctgp-7.github.io/)? - Yellows8 for this enduringly useful tool: [3ds_ropkit](https://github.com/yellows8/3ds_ropkit) - All my testers in the issue threads! ## What is this? This is a BETA stage exploit for Mario Kart 7. This is the 3rd public exploit of this extremely popular game, this time targeting the extdata savegame on the sd card. It works with both the digital and cartridge version of the game on app version 1.2. ## Bugs, how? The flaw was found by simply fuzzing the entire 1MB+ of extdata save file, creatively named "data.dat", with continuous, indexed u32s. This resulted in multiple crashes, with some contexts having my fuzzed data showing up in various registers, and near branch instructions. The wide scope of this fuzzing naturally crashed multiple threads, but those were pretty easily handled by spraying their stacks with sleeping gas :p The US version 1.2 of the source code has comments if you need more precise (and messy) info. ## Directions -- Prereqs: * UNHACKED USA, EUR, or JPN new3ds or old3ds. * An eShop, pack-in, or cartridge version of Mario Kart 7 (pack-in means it came with your system out of the box) * Mario Kart 7 needs to be updated to game version 1.2. If you've played online recently, this should already be the case. * Windows PC with sd reader -- Prep:<br> a. Update your 3DS to 11.17 <br> b. Do https://3ds.hacks.guide/seedminer.html but STOP it and continue here when you get your system's movable.sed file. Place movable.sed inside the resources folder.<br> c. Boot up your 3ds and start mk7 and tap "Mario Kart Channel" at the bottom and then "Streetpass List" also at the bottom. Then agree to all the prompts to get sd extdata, spotpass, streetpass, etc. initialized. Skip this if you've already done it of course.<br> d. Turn off Wifi on your 3DS. The reason is spotpass could download and erase the exploit payload with it on.<br> -- Optional (but highly recommended) Backing up your extdata save<br> a. Place your 3DS sd card into your PC.<br> b. Go inside the "backup" folder and double click the backup_???.bat script where ??? is your 3DS's region.<br> c. A folder like 306, 307, or 308 should appear in the backup folder if the backup worked. <br> After the exploit is finished and you've installed cfw, you may run the restore_???.bat to restore your old unhacked save.<br> Instructions: 1. Place your 3DS sd card into your PC. 2. In the Release folder, double click on the .bat that matches your 3DS's region. 3. Your hax upload worked if there is no red text and it says "finished" at the end. 4. Turn your 3ds on, boot mk7, and go to Mario Kart Channel -> Streetpass List -> Press A repeatedly (but slowly) to get past all the menu screens until the hax launches. You should see colors change on the bottom screen: Red -> Blue -> White. If you don't see those colors, and it just crashes (2-3 tries), then go to System Settings and change your language. Repeat language changes if you still can't get colored bottom screens after running exploit. 
If you do get those colors but it still crashes, just keep trying on that same language - you're almost there.<br> Note: JPN only has 1 language so language switching doesn't apply. 5. If successful, you should arrive at a black and white screen titled "3DS ROP xPloit Injector". You now have a choice of further exploits to install: unsafe_mode or menuhax67, but that's the subject of another tutorial! Enjoy! ## Q&A Q: Omg, online cheaters will thrive! How could you?<br> A: The user has to knowingly install the exploit to their sdmc extdata using seedminer and a PC tool, it's not RCE in any way.<br> Q: This could mess up my save!<br> A: While I offer backup procedures in the instructions, I can make no guarantees this exploit can't mess up the overall savegame state somehow. I have no idea how the extdata save (or spotpass/streetpass) interacts with the normal savegame, for instance. This is why I need testers, and why I am calling this a BETA! Let's find out!<br> Q: Why can't I just run kartdlphax?<br> A: Well you certainly can, it's a great exploit. In fact, I'd recommend kartdlphax over Kartminer7, but you do need a 2nd hacked 3ds for it. If you don't have a 2nd hacked 3ds, I think kartminer7 could work well for you (after the beta phase, of course).<br> Q: Hi, I'm a famous youtube content creator and I'm not sure whether to credit Zoogie when I make video tutorials, does he like to stay hidden?<br> A: I would like to be credited please, thanks. I'm not that secretive!<br>
nelson-liu/lost-in-the-middle
https://github.com/nelson-liu/lost-in-the-middle
Code and data for "Lost in the Middle: How Language Models Use Long Contexts"
# Lost in the Middle: How Language Models Use Long Contexts This repository contains accompanying material for [Lost in the Middle: How Language Models Use Long Contexts](https://arxiv.org/abs/2307.03172). ## Table of Contents - [Installation](#installation) - [Multi-Document Question Answering Experiments](#multi-document-question-answering-experiments) - [Multi-Document Question Answering Data](#multi-document-question-answering-data) * [Generating new multi-document QA data.](#generating-new-multi-document-qa-data) - [Key-Value Retrieval Experiments](#key-value-retrieval-experiments) - [Key-Value Retrieval Data](#key-value-retrieval-data) * [Generating new key-value retrieval data](#generating-new-key-value-retrieval-data) - [Testing Prompting Templates](#testing-prompting-templates) - [References](#references) ## Installation 1. Set up a conda environment ``` sh conda create -n lost-in-the-middle python=3.9 --yes conda activate lost-in-the-middle ``` 2. Install package and requirements ``` sh pip install -e . ``` 3. (optional) set up pre-commit hooks for development. ``` sh pre-commit install ``` ## Multi-Document Question Answering Experiments See [EXPERIMENTS.md](./EXPERIMENTS.md#multi-document-question-answering) for instructions to run and evaluate models on the multi-document QA task. ## Multi-Document Question Answering Data [`qa_data/`](./qa_data/) contains multi-document question answering data for the oracle setting (1 input document, which is exactly the passage the answers the question) and 10-, 20-, and 30-document settings (where 1 input passage answers the question, and the other passages do not contain an NQ-annotated answer). Each line of this gzipped file is in the following format: ``` sh { "question": "who got the first nobel prize in physics", "answers": [ "Wilhelm Conrad Röntgen" ], "ctxs": [ ... { "id": <string id, e.g., "71445">, "title": <string title of the wikipedia article that this passage comes from>, "text": <string content of the passage>, "score": <string relevance score, e.g. "1.0510446">, "hasanswer": <boolean, whether any of the values in the `answers` key appears in the text>, "original_retrieval_index": <int indicating the original retrieval index. for example, a value of 0 indicates that this was the top retrieved document>, "isgold": <boolean, true or false indicating if this chunk is the gold answer from NaturalQuestions> }, ... ], "nq_annotated_gold": { "title": <string title of the wikipedia article containing the answer, as annotated in NaturalQuestions>, "long_answer": "<string content of the paragraph element containing the answer, as annotated in NaturalQuestions>", "chunked_long_answer": "<string content of the paragraph element containing the answer, randomly chunked to approximately 100 words>", "short_answers": [ <string short answers, as annootated in NaturalQuestions> ] } } ``` ### Generating new multi-document QA data. 1. First, download Contriever retrieval results for each of the queries: ``` sh wget https://nlp.stanford.edu/data/nfliu/lost-in-the-middle/nq-open-contriever-msmarco-retrieved-documents.jsonl.gz ``` 2. 
Then, to generate examples with 20 total documents, with the relevant document placed at position 0, 4, 9, 14, or 19, run: ``` sh for gold_index in 0 4 9 14 19; do python -u ./scripts/make_qa_data_from_retrieval_results.py \ --input-path nq-open-contriever-msmarco-retrieved-documents.jsonl.gz \ --num-total-documents 20 \ --gold-index ${gold_index} \ --output-path qa_data/nq-open-20_total_documents_gold_at_${gold_index}.jsonl.gz done ``` ## Key-Value Retrieval Experiments See [EXPERIMENTS.md](./EXPERIMENTS.md#key-value-retrieval) for instructions to run and evaluate models on the key-value retrieval task. ## Key-Value Retrieval Data [`kv_retrieval_data/`](./kv_retrieval_data/) contains synthetic key-value retrieval data, where each example asks the model to retrieve the value associated with a particular key from a long list of random key-value pairs. Each line of this gzipped file is in the following format: ``` sh { "ordered_kv_records": [ ... [ "adde4211-888b-48e3-9dbe-66c66551b05f", "8bc3e90d-e044-4923-9a5d-7bf48917ed39" ], [ "2a8d601d-1d69-4e64-9f90-8ad825a74195", "bb3ba2a5-7de8-434b-a86e-a88bb9fa7289" ], [ "006b46ef-03fd-4380-938c-49cb03754370", "9faeacbe-1d0e-40da-a5db-df598a104880" ], ... ], "key": "2a8d601d-1d69-4e64-9f90-8ad825a74195", "value": "bb3ba2a5-7de8-434b-a86e-a88bb9fa7289" } ``` `ordered_kv_records` is a list of `[key, value]` pairs. The `key` specifies a particular key to retrieve from `ordered_kv_records`, and the `value` gives its expected associated value. ### Generating new key-value retrieval data To generate new key-value retrieval data, use: ``` sh python -u ./scripts/make_kv_retrieval_data.py \ --num-keys 300 \ --num-examples 500 \ --output-path kv-retrieval_data/kv-retrieval-300_keys.jsonl.gz ``` ## Testing Prompting Templates Code for converting the examples into string prompts is in [`src/lost_in_the_middle/prompting.py`](./src/lost_in_the_middle/prompting.py). After following the installation instructions above, you can run tests with: ``` sh $ py.test tests/ ========================================= test session starts ========================================= platform linux -- Python 3.9.17, pytest-7.4.0, pluggy-1.2.0 rootdir: /home/nfliu/git/lost-in-the-middle collected 7 items tests/test_prompting.py ....... [100%] ========================================== 7 passed in 0.08s ========================================== ``` ## References Please consider citing our work if you found this code or our paper beneficial to your research. ``` @misc{liu-etal:2023:arxiv, author = {Nelson F. Liu and Kevin Lin and John Hewitt and Ashwin Paranjape and Michele Bevilacqua and Fabio Petroni and Percy Liang}, title = {Lost in the Middle: How Language Models Use Long Contexts}, note = {arXiv:2307.03172}, year = {2023} } ```
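As a small, unofficial illustration of consuming the data formats documented above, the following Python sketch reads one of the gzipped multi-document QA files (file name taken from the generation loop above) and reports where the gold passage sits in each example:

```python
import gzip
import json

path = "qa_data/nq-open-20_total_documents_gold_at_9.jsonl.gz"

with gzip.open(path, "rt") as f:
    for line in f:
        example = json.loads(line)
        # Positions of the gold (NQ-annotated) passage among the input contexts.
        gold_positions = [i for i, ctx in enumerate(example["ctxs"]) if ctx["isgold"]]
        print(example["question"], "->", example["answers"], "| gold at", gold_positions)
```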
MrTalentDev/AtEM-backend
https://github.com/MrTalentDev/AtEM-backend
null
# AtEMkart Full-stack AtEMkart with Admin Dashboard & Paytm Payment Gateway. ### How to run - clone the entire repo - cd AtEMkart - npm install - cd client - npm install - cd .. - npm run dev ### Node version: 16.20.0
maiqingqiang/go-orm-helper
https://github.com/maiqingqiang/go-orm-helper
🚀🚀 A GoLand plugin that automatically provides database field completion, tags, and generates Structs for writing ORM code. Supports: Gorm、Xorm、Beego、GoFrame, etc. ⭐️ Star to support our work!一个为了让你在 Goland 上写 ORM 能自动补全数据库字段、Tag、生成Struct 的插件。支持:Gorm、Xorm、Beego、GoFrame...)请给个 ⭐️Star 支持我们的工作吧!
English | [简体中文](./README-zh_CN.md) | [日本語](./README-ja_JP.md) | [한국어](./README-ko_KR.md) <div align="center"> <img src="https://blog.johnmai.top/go-orm-helper/src/main/resources/icons/icon64x64.svg" alt="Go ORM Helper"/> <h1 align="center">Go ORM Helper</h1> </div> <p align="center">A GoLand plugin that automatically provides database field autocompletion, tags, and generates Structs for writing ORM code. Supports: Gorm、Xorm、Beego、GoFrame, etc.<br/>⭐️ Star to support our work!</p> > Inspired by [Laravel Idea](https://plugins.jetbrains.com/plugin/13441-laravel-idea) & > [PhpStorm metadata](https://www.jetbrains.com/help/phpstorm/ide-advanced-metadata.html). When using ORM packages in Golang, > I noticed that some ORM function parameters are strings and the IDE does not support code completion. It becomes inconvenient > to write code when dealing with numerous fields. Previously, when I was writing PHP, I used similar plugins and found them > extremely useful. Hence, this plugin was developed. ## Features - [x] ORM Code Completion. - [x] Assisted code completion with @Model annotation. - [x] Assisted code completion with @Table annotation. - [ ] Assisted code completion for custom SQL. 🚧[WIP] - [x] SQL to Struct conversion. [Supported](./SUPPORTED.md#supported-sql-to-struct-conversion) - [x] Go ORM Tags Live Template. [Supported](./SUPPORTED.md#supported-orm-tags-live-template) - More features waiting to be discovered and improved... ## Supported ORM Code Completion. - [x] [Gorm](https://github.com/go-gorm/gorm) - [x] [Xorm](https://gitea.com/xorm/xorm) - [x] [GoFrame](https://github.com/gogf/gf) - [ ] [Beego](https://github.com/beego/beego) 🚧[WIP] - [ ] [sqlx](https://github.com/jmoiron/sqlx) 🚧[WIP] - [More Supported](./SUPPORTED.md) ## Usage ### Code Completion https://www.jetbrains.com/help/go/auto-completing-code.html#code-completion-for-functions ![guide.gif](assets%2Fguide.gif) ### Annotation If the plugin is not compatible with your syntax, you can use the @Model or @Table helper. ![annotation.gif](assets%2Fannotation.gif) The plugin will scan the Structs in your project. It is recommended to set the scanning scope. ![setting.png](assets%2Fsetting.png) ### Live Template ![live-template.gif](assets%2Flive-template.gif) ### SQL to Struct #### Paste ![paste.gif](assets%2Fpaste.gif) #### Menu Action Selected SQL -> Editor Popup Menu -> Go ORM Helper Tool -> SQL Convert Struct ![manual-sql-to-struct.png](assets%2Fmanual-sql-to-struct.png) <!-- Plugin description end --> ## Installation > Compatibility Range: > - GoLand — 2022.2+ > - IntelliJ IDEA Ultimate — 2022.2+ ### Install Plugin from Jetbrains Marketplace <a href="https://plugins.jetbrains.com/plugin/22173-go-orm-helper" target="_blank"> <img src="https://blog.johnmai.top/go-orm-helper/assets/installation_button.svg" height="52" alt="Get from Marketplace" title="Get from Marketplace"> </a> ### Install Plugin from Disk - Download Go ORM Helper Plugin [Releases](https://github.com/maiqingqiang/go-orm-helper/releases) - Install plugins guide: https://www.jetbrains.com/help/idea/managing-plugins.html
aras-p/UnityGPUTexCompression
https://github.com/aras-p/UnityGPUTexCompression
How to do DXT/BCn texture compression in Unity using compute shaders
# Realtime DXT/BCn compression, in Unity, on the GPU Small testbed to see how compute shaders can be used to do texture compression _on the GPU_ in Unity. ![Screenshot](/screenshot.png?raw=true "Screenshot") ### Outline of how to do GPU texture compression: 1. Input is any 2D texture (regular texture, render texture etc.) that the GPU can sample. 2. We'll need a temporary `RenderTexture` that is 4x smaller than the destination texture on each axis, i.e. each "pixel" in it is one BCn block. Format of the texture is `GraphicsFormat.R32G32_SInt` (64 bits) for DXT1/BC1, and `GraphicsFormat.R32G32B32A32_SInt` (128 bits) otherwise. We'll want to make it writable from a compute shader by setting `enableRandomWrite=true`. 3. Output is same size as input (plus any padding to be multiple-of-4 size) `Texture2D` using one of compressed formats (DXT1/BC1, DXT5/BC3 etc.). We only need it to exist on the GPU, so create Texture2D with `TextureCreationFlags.DontInitializePixels | TextureCreationFlags.DontUploadUponCreate` flags to save some time, and call `Apply(false, true)` on it; the last argument ditches the CPU side memory copy. 4. A compute shader reads input texture from step 1, does {whatever GPU texture compression you do}, and writes into the "one pixel per BCn block" temporary texture from step 2. 5. Now we must copy from temporary "one pixel per BCn block" texture (step 2) into actual destination texture (step 3). `Graphics.CopyTexture` or `CommandBuffer.CopyTexture` with just source and destination textures *will not work* (since that one checks "does width and height match", which they don't - they differ 4x on each axis). But, `Graphics.CopyTexture` (or CommandBuffer equivalent) that takes `srcElement` and `dstElement` arguments (zeroes for the largest mip level) *does work*! 7. Profit! 📈 ### What is in this project: Project is based on Unity 2022.3.4. There's one scene that renders things, compresses the rendered result and displays it on screen. The display on screen also shows the difference (multiplied 2x) between original and compressed, as well as alpha channel and difference of that between original and compressed. Actual GPU texture compressors are just code taken from external projects, under `GPUTexCompression/External`: * `AMD_Compressonator`: [AMD Compressonator](https://github.com/GPUOpen-Tools/compressonator/tree/master/cmp_core/shaders), rev 7d929e9 (2023 Jan 26), MIT license. BC1 and BC3 compression with a tunable quality level. * `FastBlockCompress`: [Microsoft Xbox ATG](https://github.com/microsoft/Xbox-ATG-Samples/tree/main/XDKSamples/Graphics/FastBlockCompress/Shaders), rev 180fa6d (2018 Dec 14), MIT license. BC1 and BC3 compression (ignores quality setting). It is extremely likely that better real-time compute shader texture compressors are possible, the two above are just the ones I found that were already written in HLSL. There's also [Betsy](https://github.com/darksylinc/betsy) but that one is written in GLSL, and possibly some others. This example is not so much about compressor itself, but rather "how to plug that into Unity". 
Timings for compression of 1280x720 image into BC3 format on several configurations I tried: | | GeForce 3080 Ti (D3D11, D3D12, Vulkan) | Apple M1 Max (Metal) | |:--- |:--- |:--- | |XDK | 0.01ms, RMSE 3.877, 2.006 | 0.01ms, RMSE 3.865, 1.994 | |AMD q<0.5 | 0.01ms, RMSE 3.562, 2.006 | 0.17ms, RMSE 3.563, 1.994 | |AMD q<0.8 | 0.01ms, RMSE 2.817, 2.006 | 0.83ms, RMSE 2.819, 1.994 | |AMD q<=1 | 3.10ms, RMSE 2.544, 1.534 | 117ms😲, RMSE 2.544, 1.524 | On Apple/Metal the AMD compressor at "high" quality level is _astonishingly_ slow when using the default (FXC) HLSL shader compiler. However, switching to a more modern DXC shader compiler `#pragma use_dxc metal` does not work at all, gives a "Error creating compute pipeline state: Compiler encountered an internal error" failure when the compute shader is actually used. Fun!
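For reference, the two RMSE figures per cell above are plain root-mean-square error against the uncompressed source (presumably the colour channels and the alpha channel reported separately, matching the on-screen difference views). A rough NumPy sketch of that metric, assuming both images are available as 8-bit RGBA arrays, could be:

```python
import numpy as np

def rmse(a: np.ndarray, b: np.ndarray) -> float:
    """Root-mean-square error between two equally shaped uint8 arrays."""
    diff = a.astype(np.float64) - b.astype(np.float64)
    return float(np.sqrt(np.mean(diff * diff)))

# original, decoded: HxWx4 uint8 RGBA arrays obtained elsewhere (e.g. GPU readback).
# color_err = rmse(original[..., :3], decoded[..., :3])
# alpha_err = rmse(original[..., 3], decoded[..., 3])
```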
simonw/llm-mpt30b
https://github.com/simonw/llm-mpt30b
LLM plugin adding support for the MPT-30B language model
# llm-mpt30b [![PyPI](https://img.shields.io/pypi/v/llm-mpt30b.svg)](https://pypi.org/project/llm-mpt30b/) [![Changelog](https://img.shields.io/github/v/release/simonw/llm-mpt30b?include_prereleases&label=changelog)](https://github.com/simonw/llm-mpt30b/releases) [![Tests](https://github.com/simonw/llm-mpt30b/workflows/Test/badge.svg)](https://github.com/simonw/llm-mpt30b/actions?query=workflow%3ATest) [![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/llm-mpt30b/blob/main/LICENSE) Plugin for [LLM](https://llm.datasette.io/) adding support for the [MPT-30B language model](https://huggingface.co/mosaicml/mpt-30b). This plugin uses [TheBloke/mpt-30B-GGML](https://huggingface.co/TheBloke/mpt-30B-GGML). The code was inspired by [abacaj/mpt-30B-inference](https://github.com/abacaj/mpt-30B-inference). ## Installation Install this plugin in the same environment as LLM. ```bash llm install llm-mpt30b ``` After installing the plugin you will need to download the ~19GB model file. You can do this by running: ```bash llm mpt30b download ``` ## Usage This plugin adds a model called `mpt30b`. You can execute it like this: ```bash llm -m mpt30b "Three great names for a pet goat" ``` The alias `-m mpt` works as well. You can pass the option `-o verbose 1` to see more verbose output - currently a progress bar showing any additional downloads that are made during execution. ## Development To set up this plugin locally, first checkout the code. Then create a new virtual environment: cd llm-mpt30b python3 -m venv venv source venv/bin/activate Now install the dependencies and test dependencies: pip install -e '.[test]' To run the tests: pytest
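Besides the CLI, LLM also exposes a Python API; assuming the plugin registers the model under the `mpt30b` ID as described above, usage from Python would look roughly like the sketch below (adapted from LLM's generic Python API, not from this plugin's own docs):

```python
import llm

# Load the model registered by the llm-mpt30b plugin.
model = llm.get_model("mpt30b")

response = model.prompt("Three great names for a pet goat")
print(response.text())
```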
AGiXT/hub
https://github.com/AGiXT/hub
Hub for Open Source AGiXT Extensions, Chains, Prompts, and Agents.
[![GitHub](https://img.shields.io/badge/GitHub-Sponsor%20Josh%20XT-blue?logo=github&style=plastic)](https://github.com/sponsors/Josh-XT) [![PayPal](https://img.shields.io/badge/PayPal-Sponsor%20Josh%20XT-blue.svg?logo=paypal&style=plastic)](https://paypal.me/joshxt) [![Ko-Fi](https://img.shields.io/badge/Kofi-Sponsor%20Josh%20XT-blue.svg?logo=kofi&style=plastic)](https://ko-fi.com/joshxt) # AGiXT Hub [![GitHub](https://img.shields.io/badge/GitHub-AGiXT%20Core-blue?logo=github&style=plastic)](https://github.com/Josh-XT/AGiXT) [![GitHub](https://img.shields.io/badge/GitHub-AGiXT%20Hub-blue?logo=github&style=plastic)](https://github.com/AGiXT/hub) [![GitHub](https://img.shields.io/badge/GitHub-AGiXT%20Light%20Hub-blue?logo=github&style=plastic)](https://github.com/AGiXT/light-hub) [![GitHub](https://img.shields.io/badge/GitHub-AGiXT%20Streamlit%20Web%20UI-blue?logo=github&style=plastic)](https://github.com/AGiXT/streamlit) [![GitHub](https://img.shields.io/badge/GitHub-AGiXT%20Python%20SDK-blue?logo=github&style=plastic)](https://github.com/AGiXT/python-sdk) [![pypi](https://img.shields.io/badge/pypi-AGiXT%20Python%20SDK-blue?logo=pypi&style=plastic)](https://pypi.org/project/agixtsdk/) [![GitHub](https://img.shields.io/badge/GitHub-AGiXT%20TypeScript%20SDK-blue?logo=github&style=plastic)](https://github.com/AGiXT/typescript-sdk) [![npm](https://img.shields.io/badge/npm-AGiXT%20TypeScript%20SDK-blue?logo=npm&style=plastic)](https://www.npmjs.com/package/agixt) [![Discord](https://img.shields.io/discord/1097720481970397356?label=Discord&logo=discord&logoColor=white&style=plastic&color=5865f2)](https://discord.gg/d3TkHRZcjD) [![Twitter](https://img.shields.io/badge/Twitter-Follow_@Josh_XT-blue?logo=twitter&style=plastic)](https://twitter.com/Josh_XT) [![Logo](https://josh-xt.github.io/AGiXT/images/AGiXT-gradient-flat.svg)](https://josh-xt.github.io/AGiXT/) [AGiXT](https://github.com/Josh-XT/AGiXT) Hub for Open Source AGiXT Providers, Extensions, Chains, Prompts, and Agents. This repository will be heavy weight as it will be the hub for all open source AGiXT Hub Providers, Extensions, Chains, Prompts, and Agents. If you want a light weight hub, visit the [AGiXT Light Hub](https://github.com/AGiXT/light-hub) repository. ## Getting Started - If you don't already have AGiXT, [follow this link for instructions to set it up.](https://github.com/Josh-XT/AGiXT#quick-start-guide) - Modify your `.env` file for AGiXT to change the `AGIXT_HUB` variable, this should be `AGiXT/hub` if you want to use the Open Source AGiXT hub (this). If you want to use your own fork of AGiXT hub, change this to your username and the name of your fork. ## Hubs - [AGiXT/hub](https://github.com/AGiXT/hub) - Open Source AGiXT Hub - [AGiXT/light-hub](https://github.com/AGiXT/light-hub) - Light Weight AGiXT Hub (Less providers and extensions, but faster to load and more applicable to most people who might not need things like transformers.) ## More Documentation Want to know more about AGiXT? Check out our [documentation](https://josh-xt.github.io/AGiXT/) or [GitHub](https://github.com/Josh-XT/AGiXT) page.
ALX-SE-Algorithmia/Demo-Project
https://github.com/ALX-SE-Algorithmia/Demo-Project
Our first demo project we will be working on together, to test collaboration skills and solidify our collaboration methods for future projects. In this project, we build a basic C Calculator program.
# Demo Project - Calculator Welcome to our first demo project aimed at testing collaboration skills. In this project, we will build a basic calculator in C and any other programming language of your choice. The goal is to enhance your knowledge of the C programming language while demonstrating effective collaboration for future projects. ## Project Description The calculator program allows users to perform arithmetic operations by entering two numbers and an arithmetic operator. The program then calculates the result based on the provided operator. ## Features - User-friendly prompt: The program provides clear instructions and prompts to guide users in entering their arithmetic expressions. - Error handling: The program handles invalid input, such as non-numeric values or invalid operators, by displaying error messages and prompting the user to enter valid input. - Decimal precision: The program supports calculations with decimal numbers, allowing users to specify the number of decimal places to round the result or enter decimal numbers directly. ## Contributing Carefully follow the steps listed below to send us your contribution: 1. [Visit the Demo Project Repo](https://github.com/ALX-SE-Algorithmia/Demo-Project/) 2. Fork the repo by clicking the ‘Fork’ button at the top right. A snippet is provided below: ![image](https://github.com/ALX-SE-Algorithmia/Demo-Project/assets/104218489/a3787b66-bf7c-428d-81b5-920ad53cea93) 3. Clone your forked repo to your local machine: `git clone <your-forked-repo-url>` You can get the URL of your forked repo in a section of your forked repo that is green in color, like in the snippet above. Click the green ‘Code’ button up there and you shall get ‘your forked repo URL’ 4. Navigate into the cloned repo on your local machine: `cd <cloned-repo-folder>` 5. In the base of your cloned-forked repo, run the command below: `git remote add upstream https://github.com/ALX-SE-Algorithmia/Demo-Project.git` 6. Make sure that your forked repo’s main branch is up to date with the Organization’s repo main branch. Go to your forked repo’s main branch and click the ‘Sync fork’ button, if it is there on your repo. ![image](https://github.com/ALX-SE-Algorithmia/Demo-Project/assets/104218489/5937fb0c-c994-4e63-88eb-1f2cf28cfe82) 7. Create a new branch using the command below: `git checkout -b <your-branch-name>` It is most preferable that your branch name is your name. E.g: `git checkout -b dohoudaniel` 8. You can start creating your C file from here on. But first, create a directory in your name and store your solutions in there. * A command to do that is: `mkdir dohoudaniel` * Navigate to the directory you have created: `cd dohoudaniel` 9. Remember to add a non-empty `README` file to your directory. When you are done, with your solutions, you can add and commit your changes: `git add .` `git commit -m “Your commit message”` 10. You can now push your changes from your local machine to your branch on GitHub: `git push origin your_branch_name` E.g: `git push origin dohoudaniel` 11. Create a pull request from your branch on GitHub. 12. Create your pull request from there on. 13. The assigned code reviewer will review your code, and eventually, you might need to make some changes. Make sure you keep an eye on your email so as to be notified of updates on your pull request. 14. Please add a little explanation in the space provided there on what you have done. ## License This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for more details. 
--- Enjoy coding! :man_technologist: :woman_technologist: :headphones: :white_heart:
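Since the project invites solutions in "any other programming language of your choice", here is one illustrative Python take on the calculator behaviour described in the Features section above (basic arithmetic plus error handling for bad numbers and operators); treat it as a sketch rather than a reference solution:

```python
def calculate(a: float, op: str, b: float) -> float:
    """Apply a single arithmetic operator to two numbers."""
    ops = {
        "+": lambda x, y: x + y,
        "-": lambda x, y: x - y,
        "*": lambda x, y: x * y,
        "/": lambda x, y: x / y,
    }
    if op not in ops:
        raise ValueError(f"invalid operator: {op!r}")
    if op == "/" and b == 0:
        raise ZeroDivisionError("division by zero")
    return ops[op](a, b)


if __name__ == "__main__":
    try:
        a = float(input("First number: "))
        op = input("Operator (+ - * /): ").strip()
        b = float(input("Second number: "))
        print(f"Result: {calculate(a, op, b):.2f}")
    except (ValueError, ZeroDivisionError) as err:
        print(f"Error: {err}")
```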
riatelab/dicopal.js
https://github.com/riatelab/dicopal.js
Discrete color palettes (hundreds of them!) for JavaScript
# Dicopal **Di**screte **co**lor **pal**ettes (*hundreds of them!*) for JavaScript. ![palettes](./palettes.png) > Dicopal offers color palettes from: > - [Colorbrewer2](https://colorbrewer2.org/) > - [Fabio Crameri's Scientific Colour Maps](https://www.fabiocrameri.ch/colourmaps/) > - [CARTOColors](https://carto.com/carto-colors/) > - [cmocean](https://matplotlib.org/cmocean/) > - [Light & Bartlein](https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1029/2004EO400002) > - [Matplotlib](https://bids.github.io/colormap/) > - [MyCarta](https://mycartablog.com/color-palettes/) > - [Tableau](https://help.tableau.com/current/pro/desktop/en-us/formatting_create_custom_colors.htm#hex-values-for-discontinued-color-palettes) > - [The Wes Anderson Palettes blog](https://wesandersonpalettes.tumblr.com/) > - [Masataka Okabe and Kei Ito's Color Universal Design (CUD) palette](https://jfly.uni-koeln.de/color/) ⏩ [Browse all the available palettes](https://observablehq.com/@mthh/hello-dicopal) ## Installation ### NPM Add the package to your project: ```bash npm install dicopal ``` ### CDN Add the script to your HTML: ```html <script src="https://unpkg.com/dicopal"></script> ``` ## Usage ### Get a palette, by name and number of colors ```javascript const pal = getPalette('Pastel', 4); // Returns the "Pastel" palette with 4 colors // { // "number": 4, // "type": "qualitative", // "name": "Pastel", // "id": "Pastel_4", // "colors": ["#66C5CC","#F6CF71","#F89C74","#DCB0F2"], // "provider": "cartocolors", // "url": "https://github.com/CartoDB/CartoColor/wiki/CARTOColor-Scheme-Names" // } ``` ### Get a palette colors, by name and number of colors ```javascript const cols = getColors('Pastel', 4); // Returns the "Pastel" palette with 4 colors // ["#66C5CC","#F6CF71","#F89C74","#DCB0F2"] ``` Colors can also be reversed: ```javascript const cols = getColors('Pastel', 4, true); // ['#DCB0F2', '#F89C74', '#F6CF71', '#66C5CC'] ``` ### List the existing palettes for a given number of colors ```javascript // Returns 135 instances of palette with 3 colors const palettes = getPalettes({ number: 3 }); ``` ### List the existing palettes for a given type (sequential, diverging, qualitative) ```javascript // Returns 160 instances of qualitative palettes const palettes = getPalettes({ type: 'qualitative' }); ``` ### List the existing palettes for a given provider (ColorBrewer, Tableau, etc.) 
```javascript // Returns 265 instances of colorbrewer palettes const palettes = getPalettes({ provider: 'colorbrewer' }); ``` ### List the existing palettes for a given name (for example, 'Accent') ```javascript // Returns the 6 instances of the "Accent" palette const palettes = getPalettes({ name: 'Accent' }); ``` ### List the existing palettes that match a set of criteria ```javascript // Returns the 12 instances of the palettes that are qualitative and have 10 colors const palettes = getPalettes({ type: 'qualitative', number: 10 }); ``` ### All the palettes or more criteria When no argument is provided, the `getPalettes` function returns all the palettes: ```javascript // Returns the 1600 instances of palettes const allPalettes = getPalettes(); ``` You can then filter the palettes yourself by any combination of criteria: ```javascript // Only sequential and diverging palettes from all providers except colorbrewer // with between 3 and 12 colors const palettes = allPalettes .filter((p) => ( ['sequential', 'diverging'].includes(p.type) && p.provider !== 'colorbrewer' && p.number >= 3 && p.number <= 12) ); ``` ### List the existing providers ```javascript const providers = getPaletteProviders(); // Returns the 10 providers ``` ### List the existing types ```javascript const providers = getPaletteTypes(); // Returns the 3 types ``` ### List the existing palette names ```javascript // Returns the 179 names ('ArmyRose', 'BrBg', 'Accent', etc.) const providers = getPaletteNames(); ``` ```javascript // Returns the 35 names ('BrBg', 'PRGn', etc.) const providers = getPaletteNames('colorbrewer'); ``` ### Not a fan of the proposed API ? Just get the raw description of the palettes and use them as you wish For a given provider: ```javascript getRawData('colorbrewer'); ``` For all the provider (default): ```javascript getRawData(); ``` ## Other information Palette information is stored in the `src/palette.json` file. It is [generated in Python](./generate-palette-descriptions.py) from various sources, notably the [palettable](https://github.com/jiffyclub/palettable) Python library (authored by [Matt Davis](https://github.com/jiffyclub)) and the [dicopal](https://lig-tdcge.imag.fr/steamer/dicopal/index.html) RDF vocabulary which both provide a list of palettes with their colors and metadata. ## License Apache-2.0. See [LICENSE](./LICENSE) for details.
zideliu/StyleDrop-PyTorch
https://github.com/zideliu/StyleDrop-PyTorch
Unofficial implementation of [StyleDrop](https://arxiv.org/abs/2306.00983)
# StyleDrop <p align="left"> <a href="https://huggingface.co/spaces/zideliu/styledrop"><img alt="Huggingface" src="https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-StyleDrop-orange"></a> <a href="https://replicate.com/cjwbw/styledrop"><img src="https://replicate.com/cjwbw/styledrop/badge"></a> <a href="https://colab.research.google.com/github/neild0/StyleDrop-PyTorch-Interactive/blob/main/styledrop_colab.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg"></a> </p> The code of StyleDrop-PyTorch has been moved to [**aim-uofa/StyleDrop-PyTorch**](https://github.com/aim-uofa/StyleDrop-PyTorch). Please try it and have fun! This is an unofficial PyTorch implementation of [StyleDrop: Text-to-Image Generation in Any Style](https://arxiv.org/abs/2306.00983). Unlike the parameters in the paper in (Round 1), we set $\lambda_A=2.0$, $\lambda_B=5.0$ and `d_prj=32`, `is_shared=False`, which we found work better, these hyperparameters can be seen in `configs/custom.py`. we release them to facilitate community research. ![result1](img/1.png) <br/><br/> ![result2](img/2.png) <br/><br/> ![result3](img/3.png) <br/><br/> ![result4](img/4.png) <br/><br/> ![result5](img/5.png) ## News - [07/11/2023] The code of StyleDrop-PyTorch has been moved to [**aim-uofa/StyleDrop-PyTorch**](https://github.com/aim-uofa/StyleDrop-PyTorch). Please try it and have fun! - [07/06/2023] Online Gradio Demo is available [here](https://huggingface.co/spaces/zideliu/styledrop) ## Todo List - [x] Release the code. - [x] Add gradio inference demo (runs in local). - [ ] Add iterative training (Round 2). ## Data & Weights Preparation First, download VQGAN from this [link](https://drive.google.com/file/d/13S_unB87n6KKuuMdyMnyExW0G1kplTbP/view) (from [MAGE](https://github.com/LTH14/mage), thanks!), and put the downloaded VQGAN in `assets/vqgan_jax_strongaug.ckpt`. Then, download the pre-trained checkpoints from this [link](https://huggingface.co/nzl-thu/MUSE/tree/main/assets/ckpts) to `assets/ckpts` for evaluation or to continue training for more iterations. finally, prepare empty_feature by runnig command `python extract_empty_feature.py` And the final directory structure is as follows: ``` . ├── assets │ ├── ckpts │ │ ├── cc3m-285000.ckpt │ │ │ ├── lr_scheduler.pth │ │ │ ├── nnet_ema.pth │ │ │ ├── nnet.pth │ │ │ ├── optimizer.pth │ │ │ └── step.pth │ │ └── imagenet256-450000.ckpt │ │ ├── lr_scheduler.pth │ │ ├── nnet_ema.pth │ │ ├── nnet.pth │ │ ├── optimizer.pth │ │ └── step.pth │ ├── fid_stats │ │ ├── fid_stats_cc3m_val.npz │ │ └── fid_stats_imagenet256_guided_diffusion.npz │ ├── pipeline.png | ├── contexts │ │ └── empty_context.npy └── └── vqgan_jax_strongaug.ckpt ``` ## Dependencies Same as [MUSE-PyTorch](https://github.com/baaivision/MUSE-Pytorch). ``` conda install pytorch torchvision torchaudio cudatoolkit=11.3 pip install accelerate==0.12.0 absl-py ml_collections einops wandb ftfy==6.1.1 transformers==4.23.1 loguru webdataset==0.2.5 gradio ``` ## Train All style data in the paper are placed in the data directory 1. Modify `data/one_style.json` (It should be noted that `one_style.json` and `style data` must be in the same directory), The format is `file_name:[object,style]` ```json {"image_03_05.jpg":["A bear","in kid crayon drawing style"]} ``` 2. Training script as follows. 
```shell #!/bin/bash unset EVAL_CKPT unset ADAPTER export OUTPUT_DIR="output_dir/for/this/experiment" accelerate launch --num_processes 8 --mixed_precision fp16 train_t2i_custom_v2.py --config=configs/custom.py ``` ## Inference The pretrained style_adapter weights can be downloaded from [🤗 Hugging Face](https://huggingface.co/zideliu/StyleDrop/tree/main). ```shell #!/bin/bash export EVAL_CKPT="assets/ckpts/cc3m-285000.ckpt" export ADAPTER="path/to/your/style_adapter" export OUTPUT_DIR="output/for/this/experiment" accelerate launch --num_processes 8 --mixed_precision fp16 train_t2i_custom_v2.py --config=configs/custom.py ``` ## Gradio Demo Put the [style_adapter weights](https://huggingface.co/zideliu/StyleDrop/tree/main) in the `./style_adapter` folder; running the following command will launch the demo: ```shell python gradio_demo.py ``` The demo is also hosted on [HuggingFace](https://huggingface.co/spaces/zideliu/styledrop). ## Citation ```bibtex @article{sohn2023styledrop, title={StyleDrop: Text-to-Image Generation in Any Style}, author={Sohn, Kihyuk and Ruiz, Nataniel and Lee, Kimin and Chin, Daniel Castro and Blok, Irina and Chang, Huiwen and Barber, Jarred and Jiang, Lu and Entis, Glenn and Li, Yuanzhen and others}, journal={arXiv preprint arXiv:2306.00983}, year={2023} } ``` ## Acknowledgements * The implementation is based on [MUSE-PyTorch](https://github.com/baaivision/MUSE-Pytorch) * Many thanks to [Zanlin Ni](https://github.com/nzl-thu) for the generous help ## Star History <img src="https://api.star-history.com/svg?repos=zideliu/StyleDrop-PyTorch&type=Date">
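As a convenience for the `data/one_style.json` format described in the Train section above (`file_name: [object, style]`, stored next to the style images), the following hypothetical Python helper — not part of this repo — writes such a file for a folder of images that share one object/style caption:

```python
import json
from pathlib import Path

# Hypothetical inputs; adjust to your own style folder and captions.
style_dir = Path("data/kid_crayon")
obj, style = "A bear", "in kid crayon drawing style"

entries = {p.name: [obj, style] for p in sorted(style_dir.glob("*.jpg"))}

# one_style.json must live in the same directory as the style images.
out = style_dir / "one_style.json"
out.write_text(json.dumps(entries, ensure_ascii=False, indent=2))
print(f"wrote {len(entries)} entries to {out}")
```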
m1guelpf/threads-re
https://github.com/m1guelpf/threads-re
Reverse-engineering Instagram's Threads private APIs.
# How Threads Works This repository contains my notes and discoveries while reverse-engineering Threads app. Feel free to PR if you've found something new, or to build clients with this info (with credit ofc 😉). ## Web (threads.net) The web version of Threads is currently read-only, so not much can be learned about authentication or posting. It uses Meta's [Relay GraphQL Client](https://relay.dev) to talk to the backend (`threads.net/api/graphql`), which seems to be configured to disallow arbitrary queries. This leaves us limited to the existing queries found in the frontend's source: > **Note** > When querying the GraphQL backend, make sure to set an user-agent (seems like anything works here) and set the `x-ig-app-id` header to `238260118697367`. ### Get profile data > Doc ID: `23996318473300828` > > Variables: `userID` (the user's ID) ```bash curl --request POST \ --url https://www.threads.net/api/graphql \ --header 'user-agent: threads-client' \ --header 'x-ig-app-id: 238260118697367' \ --header 'content-type: application/x-www-form-urlencoded' \ --data 'variables={"userID":"314216"}' \ --data doc_id=23996318473300828 ``` ### Get profile posts > Doc ID: `6232751443445612` > > Variables: `userID` (the user's ID) ```bash curl --request POST \ --url https://www.threads.net/api/graphql \ --header 'user-agent: threads-client' \ --header 'x-ig-app-id: 238260118697367' \ --header 'content-type: application/x-www-form-urlencoded' \ --data 'variables={"userID":"314216"}' \ --data doc_id=6232751443445612 ``` ### Get profile replies > Doc ID: `6307072669391286` > > Variables: `userID` (the user's ID) ```bash curl --request POST \ --url https://www.threads.net/api/graphql \ --header 'user-agent: threads-client' \ --header 'x-ig-app-id: 238260118697367' \ --header 'content-type: application/x-www-form-urlencoded' \ --data 'variables={"userID":"314216"}' \ --data doc_id=6307072669391286 ``` ### Get a post > Doc ID: `5587632691339264` > > Variables: `postID` (the post's ID) ```bash curl --request POST \ --url https://www.threads.net/api/graphql \ --header 'user-agent: threads-client' \ --header 'x-ig-app-id: 238260118697367' \ --header 'content-type: application/x-www-form-urlencoded' \ --data 'variables={"postID":"3138977881796614961"}' \ --data doc_id=5587632691339264 ``` ### Get a list of users who liked a post > Doc ID: `9360915773983802` > > Variables: `mediaID` (the post's ID) ```bash curl --request POST \ --url https://www.threads.net/api/graphql \ --header 'user-agent: threads-client' \ --header 'x-ig-app-id: 238260118697367' \ --header 'content-type: application/x-www-form-urlencoded' \ --data 'variables={"mediaID":"3138977881796614961"}' \ --data doc_id=9360915773983802 ``` ## Mobile Apps ### Authentication > **Warning** > This endpoint currently only works for accounts without 2FA enabled. The mobile apps use Meta's Bloks framework ([originally built for Instagram Lite](https://thenewstack.io/instagram-lite-is-no-longer-a-progressive-web-app-now-a-native-app-built-with-bloks/)) for authentication. The bloks versioning ID for threads is `00ba6fa565c3c707243ad976fa30a071a625f2a3d158d9412091176fe35027d8`. Bloks also requires you to provide a device id (of shape `ios-RANDOM` | `android-RANDOM`, `RANDOM` being a random set of 13 chars). 
```bash curl --request POST \ --url 'https://i.instagram.com/api/v1/bloks/apps/com.bloks.www.bloks.caa.login.async.send_login_request/' \ --header 'user-agent: Barcelona 289.0.0.77.109 Android' \ --header 'sec-fetch-site: same-origin' \ --header 'content-type: application/x-www-form-urlencoded; charset=UTF-8' \ --data 'params={"client_input_params":{"password":"$PASSWORD","contact_point":"$USERNAME","device_id":"$DEVICE_ID"},"server_params":{"credential_type":"password","device_id":"$DEVICE_ID"}}' \ --data 'bloks_versioning_id=00ba6fa565c3c707243ad976fa30a071a625f2a3d158d9412091176fe35027d8' ``` This request returns a big JSON payload. Your token will be immediately after the string `Bearer IGT:2:`, and should be 160 characters long. ### Creating a text post ```bash curl --request POST \ --url 'https://i.instagram.com/api/v1/media/configure_text_only_post/' \ --header 'content-type: application/x-www-form-urlencoded; charset=UTF-8' \ --header 'user-agent: Barcelona 289.0.0.77.109 Android' \ --header 'authorization: Bearer IGT:2:$TOKEN' \ --header 'sec-fetch-site: same-origin' \ --data 'signed_body=SIGNATURE.{"publish_mode":"text_post","text_post_app_info":"{\"reply_control\":0}","timezone_offset":"0","source_type":"4","_uid":"$USER_ID","device_id":"$DEVICE_ID","caption":"$POST_TEXT","device":{"manufacturer":"OnePlus","model":"ONEPLUS+A3003","android_version":26,"android_release":"8.1.0"}}' ``` ## Misc ### How to get a profile's id from their username? Threads uses the same ID system used by Instagram. The best approach to convert from username to id seems to be requesting the user's instagram page (`instagram.com/:username`) and manually parsing the response HTML. For other methods, see [this StackOverflow question](https://stackoverflow.com/questions/11796349/instagram-how-to-get-my-user-id-from-username).
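The read-only GraphQL calls above translate directly to any HTTP client. For example, a minimal Python `requests` version of the "Get profile data" query, using the same doc ID, headers, and variables as the curl command, could look like:

```python
import json
import requests

resp = requests.post(
    "https://www.threads.net/api/graphql",
    headers={
        "user-agent": "threads-client",
        "x-ig-app-id": "238260118697367",
        "content-type": "application/x-www-form-urlencoded",
    },
    data={
        "variables": json.dumps({"userID": "314216"}),
        "doc_id": "23996318473300828",
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```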
Helixo32/NimBlackout
https://github.com/Helixo32/NimBlackout
Kill AV/EDR leveraging BYOVD attack
# NimBlackout [![Nim Version](https://img.shields.io/badge/nim-1.6.8-orange.svg)](https://nim-lang.org/) > **Note**: This project is for educational purposes only. The use of this code for any malicious activity is strictly prohibited. I am not responsible for any misuse of this software. NimBlackout is an adaptation of the [@Blackout](https://github.com/ZeroMemoryEx/Blackout) project originally developed in C++ by [@ZeroMemoryEx](https://github.com/ZeroMemoryEx), which kills AV/EDR processes by abusing the vulnerable gmer driver (a BYOVD attack). The main reason for this project was to understand how BYOVD attacks work and to provide a working PoC written in Nim. All credit must go to the original author [@ZeroMemoryEx](https://github.com/ZeroMemoryEx). # Usage - Compilation - Linux ``` nim --os:windows --cpu:amd64 --gcc.exe:x86_64-w64-mingw32-gcc --gcc.linkerexe:x86_64-w64-mingw32-gcc c NimBlackout.nim ``` - Windows ``` nim c NimBlackout.nim ``` - Put the Blackout.sys driver into the current directory - Launch NimBlackout (with admin privileges) ``` NimBlackout.exe <process name> ``` To prevent the target process (such as MsMpEng.exe) from restarting, keep the program running. # Demo ![](https://github.com/Helixo32/NimBlackout/blob/main/Github_CrimsonKiller.gif)
Xmingbai/Minisforum-UM560-UM580-APU-Hackintosh
https://github.com/Xmingbai/Minisforum-UM560-UM580-APU-Hackintosh
null
# Minisforum-UM560XT/UM560/UM580-Hackintosh-APU Adapted from daliansky (小兵)'s bootloader: https://github.com/daliansky/minisforum-UM560XT-Hackintosh Bilibili review video: https://www.bilibili.com/video/BV13X4y1W7ru # Different config files cover the Minisforum UM560/UM560XT/UM580 (see the Releases panel on the right for downloads) ![](https://github.com/Xmingbai/Minisforum-UM560XT-APU-Hackintosh/blob/main/EFI.png) The default config.plist targets the UM560XT, UM560-config.plist targets the UM560, and UM580config.plist targets the UM580 # What works under macOS USB, 2.5G Ethernet, Wi-Fi, Bluetooth, audio (display audio is not supported), HDMI/Type-C output (both support 4K high-refresh-rate monitors), sleep and wake # BIOS settings Disable Secure Boot; set the VRAM size manually to 2G/4G (4G recommended) ![](https://github.com/Xmingbai/Minisforum-UM560XT-APU-Hackintosh/blob/main/13.png) ![](https://github.com/Xmingbai/Minisforum-UM560XT-APU-Hackintosh/blob/main/14.png) ![](https://github.com/Xmingbai/Minisforum-UM560XT-APU-Hackintosh/blob/main/R23.png) ![](https://github.com/Xmingbai/Minisforum-UM560XT-APU-Hackintosh/blob/main/Vega7-OpenCL-Metal.png) ![](https://github.com/Xmingbai/Minisforum-UM560XT-APU-Hackintosh/blob/main/HDMI-4K120.png) ![](https://github.com/Xmingbai/Minisforum-UM560XT-APU-Hackintosh/blob/main/Type-C-4K95.png) ![](https://github.com/Xmingbai/Minisforum-UM560XT-APU-Hackintosh/blob/main/AX200.png)
m1guelpf/threads-api
https://github.com/m1guelpf/threads-api
Reverse-engineered Rust client for Instagram's Threads app.
# Instagram Threads API [![Crates.io](https://img.shields.io/crates/v/threads-api.svg)](https://crates.io/crates/threads-api) [![Docs.rs](https://docs.rs/threads-api/badge.svg)](https://docs.rs/threads-api) [![MIT licensed](https://img.shields.io/badge/license-MIT-blue.svg)](LICENSE) > Unofficial, Reverse-Engineered Rust client for Instagram's [Threads](https://threads.net). ## Usage ```rust use threads_api::Threads; let client = Threads::new(); let user = client.profile(user_id).await?; let posts = client.posts(user_id).await?; let posts = client.replies(user_id).await?; ``` ## 📌 Roadmap - [x] Get user profile - [x] Get user posts - [x] Get user replies - [x] Get post replies - [x] Get post likes - [ ] Authentication - [ ] Post a thread - [ ] Post a reply - [ ] Update profile details - [ ] Follow a user - [ ] Unfollow a user ## License This project is open-sourced under the MIT license. See [the License file](LICENSE) for more information.
surprisetalk/blogs.hn
https://github.com/surprisetalk/blogs.hn
tiny directory of tech blogs
# [blogs.hn](https://blogs.hn) [blogs.hn](https://blogs.hn) is for _personal_ blogs! Non-technical blogs are okay too, but remember that the community is mostly folks from [HackerNews](https://news.ycombinator.com). ## Adding Blogs This repo comes with a helper script `fetch.js` to automatically grab blog info from a URL. But feel free to manually add/edit information to `blogs.json`! 1. Run the script: ```bash npm init -y && npm i axios cheerio node fetch.js "https://taylor.town" "https://gwern.net" ``` 2. Clean the output: ```json [ { "url": "https://taylor.town", "title": "Taylor Troesh", "about": "https://taylor.town/about", "now": "https://taylor.town/now", "feed": "https://taylor.town/feed.xml", "desc": "🐸🐸🐸🐸🐸🐸🐸🐸🐸🐸🐸🐸🐸🐸", "hn": [ { "created_at": "2023-04-04T11:42:33.000Z", "title": "How to be a -10x Engineer", "url": "https://taylor.town/-10x", "points": 875, "comments": 494, "id": "35438068" } ] }, { "url": "https://gwern.net", "title": "Essays · Gwern.net", "about": "https://gwern.net/about", "feed": "https://gwern.substack.com/feed", "keywords": "meta", "desc": "Personal website of Gwern Branwen (writer, self-experimenter, and programmer): topics: psychology, statistics, technology, deep learning, anime. This index page is a categorized list of Gwern.net pages.", "hn": [ { "created_at": "2019-01-21T12:08:15.000Z", "title": "On whether changes in bedroom CO2 levels affect sleep quality", "url": "https://www.gwern.net/zeo/CO2", "points": 576, "comments": 306, "id": "18959796" } ] } ] ``` 3. Add to [blogs.json](https://github.com/surprisetalk/blogs.hn/blob/main/blogs.json) in a pull request. To prevent merge conflicts, please don't append to the top or bottom!
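Before opening a pull request it can help to sanity-check the file you edited. Assuming `blogs.json` is a flat JSON array of entries shaped like the `fetch.js` output above, a tiny Python sketch (not part of this repo) that flags missing or duplicate URLs would be:

```python
import json

with open("blogs.json") as f:
    blogs = json.load(f)

seen = set()
for entry in blogs:
    url = entry.get("url", "")
    if not url.startswith("http"):
        print("missing/odd url:", entry.get("title", entry))
    elif url in seen:
        print("duplicate url:", url)
    seen.add(url)

print(f"{len(blogs)} entries checked")
```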
Byxs20/PuzzleSolver
https://github.com/Byxs20/PuzzleSolver
A tool for the MISC category of CTF competitions~
# PuzzleSolver A tool for the MISC category of CTF competitions~ # Q&A Q: ``` [*] cloacked-pixel failed to run! Traceback (most recent call last): File "core\ErrorCore\errorCore.py", line 13, in wrapper File "core\BruteForce\Cloacked_Pixel_Func\cloacked_pixel.py", line 75, in BruteForceLSB File "magic\magic.py", line 71, in __init__ File "magic\magic.py", line 272, in magic_load File "magic\magic.py", line 203, in errorcheck_negative_one magic.magic.MagicException: b'could not find any valid magic files!' ``` A: ``` The simplest fix is to move the tool to a different directory (the cause is Chinese characters in the path). Some people ask why this still happens when the tool is on the Desktop — that is because the Windows user name is Chinese. Any drive such as C/D/E/F will do; in short, keep Chinese characters out of the path! ``` A small path check is also sketched at the end of this README. # Donations The software took quite a bit of my time. At first it was only meant to make my own life easier, but it turned out to be quite useful, so I'm sharing it for everyone to use! (A labor of love, really...) <img src="./images/wx.png" width=400> <img src="./images/alipay.jpg" width=400>
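As mentioned in the Q&A above, the `magic` error comes from non-ASCII (e.g. Chinese) characters in the install path. A quick Python check of the current directory can confirm a path is safe before launching the tool:

```python
import os

path = os.path.abspath(os.getcwd())
if path.isascii():
    print("OK: no non-ASCII characters in", path)
else:
    print("Move the tool elsewhere: non-ASCII characters found in", path)
```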
NUST-Machine-Intelligence-Laboratory/HVC
https://github.com/NUST-Machine-Intelligence-Laboratory/HVC
null
# HVC (Dynamic in Static: Hybrid Visual Correspondence for Self-Supervised Video Object Segmentation) >This repository is the official PyTorch implementation of the paper "**Dynamic in Static: Hybrid Visual Correspondence for Self-Supervised Video Object Segmentation**" ## Abstract >Conventional video object segmentation (VOS) methods usually necessitate a substantial volume of pixel-level annotated video data for fully supervised learning. In this paper, we present HVC, a **h**ybrid spatial-temporal **v**isual **c**orrespondence framework for self-supervised VOS setting. HVC extracts pseudo-dynamic signals from static images to achieve an economical and scalable VOS model. Our approach utilizes a minimalist fully-convolutional architecture to capture spatial-temporal visual correspondence in image-cropped views. To achieve this objective, we present a unified self-supervised approach to learn representations of spatial-temporal feature similarity. Firstly, we establish spatial correspondence by utilizing a priori coordinate information between cropped views to guide the formation of consistent spatial feature representations. Subsequently, we devise a concise convolutional layer to capture the forward / backward pseudo-dynamic flows between two views, serving as a cue for temporal representations. Finally, we propose a hybrid visual correspondence loss to learn joint spatial and temporal consistency representations. Our approach, without bells and whistles, necessitates only one training session from static image data, minimizing memory consumption (**16GB**) and training time (**2h**). Moreover, HVC achieves state-of-the-art performance in several self-supervised VOS benchmarks. ![HVC](assets/pipeline.png) ><center>The architeture of HVC.</center> ## Highlights - **Performance:** HVC achieves SOTA self-supervised results on video object segmentation, part propagation, and pose tracking. **video object segmentation** (J&F Mean): DAVIS16 val-set: **80.1** DAVIS17 val-set: **73.1** DAVIS17 dev-set: **61.7** YouTube-VOS 2018 val-set: **71.9** YouTube-VOS 2019 val-set: **71.6** **part propagation** (mIoU): VIP val-set: **44.6** **pose tracking** (PCK): JHMDB val-set: **61.7** [email protected]; **82.8** [email protected] - **Efficiency:** HVC necessitates only one training session from static image data (COCO), minimizing memory consumption (∼**16GB**) and training time (∼**2h**). - **Robustness**: HVC enables the same self-supervised VOS performance with static image datasets ([COCO](https://cocodataset.org/): **73.1**, [PASCAL VOC](http://host.robots.ox.ac.uk/pascal/VOC/): **72.0**, and [MSRA10k](https://mmcheng.net/msra10k/): **71.1**) as with the video dataset (e.g. [YouTube-VOS](https://youtube-vos.org/dataset/vos/): **73.1**). 
## Requirements - python 3.9 - torch==1.12.1 - torchvision==0.13.1 - CUDA 11.3 Create a conda envvironment: ```bash conda create -n hvc python=3.9 -y conda activate hvc pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 --extra-index-url https://download.pytorch.org/whl/cu113 ``` ## Model Zoo and Results | Model | J&F Mean ↑ | Download | | :-----------------------------: | :----------: | :----------------------: | | HVC pre-trained on COCO | DAVIS17 val-set: 73.1 | [model](https://github.com/PGSmall/HVC/releases/download/v0.1/releases_models.zip) / [results](https://github.com/PGSmall/HVC/releases/download/v0.1/hvc_coco.zip) | | HVC pre-trained on PASCAL VOC | DAVIS17 val-set: 72.0 | [model](https://github.com/PGSmall/HVC/releases/download/v0.1/releases_models.zip) / [results](https://github.com/PGSmall/HVC/releases/download/v0.1/hvc_voc.zip) | | HVC pre-trained on MSRA10k | DAVIS17 val-set: 71.1 | [model](https://github.com/PGSmall/HVC/releases/download/v0.1/releases_models.zip) / [results](https://github.com/PGSmall/HVC/releases/download/v0.1/hvc_msra.zip) | | HVC pre-trained on YouTube-VOS | DAVIS17 val-set: 73.1 | [model](https://github.com/PGSmall/HVC/releases/download/v0.1/releases_models.zip) / [results](https://github.com/PGSmall/HVC/releases/download/v0.1/hvc_ytb.zip) | ``Note:`` HVC requires only one training session to infer all test datasets for VOS. ## Dataset Preparation For the static image dataset: 1. Donwload the COCO 2017 train-set from the [COCO website](https://cocodataset.org/). 2. Please ensure the datasets are organized as following format. ``` data |--filelists |--Static |--JPEGImages |--COCO |--MSRA10K |--PASCAL ``` For the video dataset: 1. Download the DAVIS 2017 val-set from the [DAVIS website](https://davischallenge.org/), the direct download [link](https://data.vision.ee.ethz.ch/csergi/share/davis/DAVIS-2017-trainval-480p.zip). 2. Download the full YouTube-VOS dataset (version 2018) from the [YouTube-VOS website](https://youtube-vos.org/dataset/vos/), the direct download [link](https://drive.google.com/drive/folders/1bI5J1H3mxsIGo7Kp-pPZU8i6rnykOw7f?usp=sharing). Please move ``ytvos.csv`` from ``data/`` to the path ``data/YTB/2018/train_all_frames``. 3. Please ensure the datasets are organized as following format. ``` data |--filelists |--DAVIS |--Annotations |--ImageSets |--JPEGImages |--YTB |--2018 |--train_all_frames |--ytvos.csv |--valid_all_frames ``` ``Note:`` Please prepare the following datasets if you want to test the DAVIS dev-set and YouTube-VOS val-set (2019 version). Download link: [DAVIS dev-set](https://data.vision.ee.ethz.ch/csergi/share/davis/DAVIS-2017-test-dev-480p.zip), [YTB 2019 val-set](https://drive.google.com/drive/folders/1BWzrCWyPEmBEKm0lOHe5KLuBuQxUSwqz?usp=sharing). ## Training ```bash # pre-train on COCO bash ./scripts/run_train_img.sh ``` Or ```bash # pre-train on YouTube-VOS bash ./scripts/run_train_vid.sh ``` ## Testing - Download [MoCo V1](https://dl.fbaipublicfiles.com/moco/moco_checkpoints/moco_v1_200ep/moco_v1_200ep_pretrain.pth.tar), and put it in the folder ``checkpoints/``. - Download [HVC](https://github.com/PGSmall/HVC/releases/download/v0.1/releases_models.zip) and unzip them into the folder ``checkpoints/``. ```bash # DAVIS 17 val-set bash ./scripts/run_test.sh hvc davis17 bash ./scripts/run_metrics hvc davis17 ``` ```bash # YouTube-VOS val-set bash ./scripts/run_test.sh hvc ytvos # Please use the official YouTube-VOS server to calculate scores. 
``` ```bash # DAVIS 17 dev-set bash ./scripts/run_test.sh hvc davis17dev # Please use the official DAVIS server to calculate scores. ``` ``Note:`` YouTube-VOS servers ([2018 server](https://codalab.lisn.upsaclay.fr/competitions/7685) and [2019 server](https://codalab.lisn.upsaclay.fr/competitions/6066)); DAVIS server ([2017 dev-set](https://codalab.lisn.upsaclay.fr/competitions/6812)). ## Qualitative Results ### Sequence Results: >![HVC](assets/heat_map.png) ><center>Learned representation visualization from HVC without any supervision.</center> >![HVC](assets/VOS.png) ><center>Qualitative results for video object segmentation.</center> >![HVC](assets/MP.png) ><center>Qualitative results of the proposed method for (a) body part propagation and (b) human pose tracking .</center> ### Video Results: <center>DAVIS 2017 val-set</center> ![HVC](assets/bmx-trees.gif) | ![HVC](assets/india.gif) ---|--- <center>YouTube-VOS val-set</center> ![HVC](assets/06a5dfb511.gif) | ![HVC](assets/f1ccd08a3d.gif) ---|--- <center>DAVIS 2017 dev-set</center> ![HVC](assets/girl-dog.gif) | ![HVC](assets/tandem.gif) ---|--- ## Acknowledgements - We thank [PyTorch](https://pytorch.org/), [YouTube-VOS](https://youtube-vos.org/), and [DAVIS](https://davischallenge.org/) contributors. - Thanks to [videowalk](https://github.com/ajabri/videowalk) for the label propagation codebases.
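For context on the J&F numbers reported in the Model Zoo above: J is the region similarity (mask IoU) and F the contour accuracy, each averaged over frames and objects. A minimal NumPy sketch of the J term for a single frame is shown below; the benchmark results themselves should of course be computed with the official DAVIS / YouTube-VOS evaluation servers listed above.

```python
import numpy as np

def region_similarity(pred: np.ndarray, gt: np.ndarray) -> float:
    """Jaccard index (IoU) between two boolean masks for one object in one frame."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    union = np.logical_or(pred, gt).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as a perfect match
    return float(np.logical_and(pred, gt).sum() / union)
```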
seifrajhi/awesome-cilium
https://github.com/seifrajhi/awesome-cilium
A curated list of awesome projects related to cilium
# Awesome Cilium [![Awesome](https://awesome.re/badge.svg)](https://github.com/sindresorhus/awesome) > A curated list of awesome projects related to Cilium. Cilium is an open-source networking project that provides networking and security capabilities for containerized applications, microservices, and virtual machines. Recently [Cilium](https://cilium.io) launched a great website about eBPF called [ebpf.io](https://ebpf.io/). It serves a similar purpose to this list, with [an introduction to eBPF](https://ebpf.io/what-is-ebpf). > Note: Cilium is an exciting piece of technology, and its ecosystem is constantly evolving. We'd love help from _you_ to keep this awesome list up to date, and improve its signal-to-noise ratio in anyway we can. Please feel free to leave [any feedback](https://github.com/seifrajhi/awesome-cilium/issues). Here are some notable projects and concepts related to Cilium: ## Contents - [Reference Documentation](#reference-documentation) - [Articles and Presentations](#articles-and-presentations) - [Community Events](community-events) - [Community and Contributing](#community-and-contributing) - [Hands on Contents](#hands-on-contents) ## Reference Documentation - [Official website](https://cilium.io/) of cilium which was originally created by [Isovalent](https://isovalent.com/) - [Official github repository](https://github.com/cilium): of cilium project. - [A cookbook of installing cilium on AWS EKS](https://github.com/littlejo/cilium-eks-cookbook): Multiple ways to install cilium in eks ### Cilium related projects - [Cilium](https://github.com/cilium/cilium): The core project, Cilium, is a networking plugin for various container runtimes such as Kubernetes, Docker, and Mesos. It leverages Linux kernel features like eBPF (extended Berkeley Packet Filter) to provide fast and secure networking and load balancing for applications. - [eBPF](https://github.com/cilium/ebpf): Extended Berkeley Packet Filter (eBPF) is a technology that allows for dynamic, programmable packet filtering and network analysis in the Linux kernel. Cilium heavily relies on eBPF to enhance the networking and security capabilities of containerized applications. - [Cilium Proxy](https://github.com/cilium/proxy): Cilium Proxy is a high-performance HTTP, TCP, and gRPC proxy that can be automatically injected into Kubernetes pods. It provides features like load balancing, health checking, and L7 (Layer 7) visibility, helping secure and monitor application traffic. - [Cilium Cluster Mesh](https://docs.cilium.io/en/v1.9/gettingstarted/clustermesh/): Cilium Cluster Mesh allows you to securely connect multiple Kubernetes clusters together using encrypted tunnels. It enables seamless communication and service discovery across clusters while maintaining strong security boundaries. - [Hubble](https://github.com/cilium/hubble): Hubble is a network visibility and monitoring tool built by the Cilium community. It provides real-time visibility into network traffic, allowing operators to gain insights into application behavior, troubleshoot connectivity issues, and enforce network security policies. - [Cilium Operator](https://docs.cilium.io/en/stable/internals/cilium_operator/): Cilium Operator is a Kubernetes operator that simplifies the deployment and management of Cilium within a Kubernetes cluster. It automates tasks such as deploying Cilium agents, configuring eBPF policies, and handling upgrades. - [Tetragon](https://github.com/cilium/tetragon): Tetragon is a runtime security enforcement and observability tool. 
- [Cilium Mesh](https://isovalent.com/blog/post/introducing-cilium-mesh/): Cilium Mesh connects Kubernetes workloads, virtual machines, and physical servers running in the cloud, on-premises, or at the edge. - [NetworkPolicy Editor](https://editor.networkpolicy.io/): Create, Visualize, and Share Kubernetes NetworkPolicies. - [Prometheus & Grafana for Cilium](https://github.com/cilium/cilium/tree/main/examples/kubernetes/addons/prometheus): Collect metrics from Cilium and store them in Prometheus for analysis and alerting. - [Cilium Helm Chart](https://artifacthub.io/packages/helm/cilium/cilium): Helm chart that can be used to deploy Cilium on Kubernetes. - [Hubble adaptor for OpenTelemetry](https://github.com/cilium/hubble-otel): Enable exporting Hubble flow data using OpenTelemetry collector. - [Bpfilter instead of iptables for routing](https://www.admin-magazine.com/Archive/2019/50/Bpfilter-offers-a-new-approach-to-packet-filtering-in-Linux): Bpfilter offers a new approach to packet filtering in Linux. ![image](https://github.com/seifrajhi/awesome-cilium/assets/26981510/b2236520-ea4c-400d-a5fd-15850a8bf420) - [Inter-node traffic control](https://docs.cilium.io/en/latest/network/kubernetes/policy/#ciliumclusterwidenetworkpolicy): Policies that are applicable to the whole cluster (non-namespaced) and provide you with the means to specify nodes as the source and target. - [BPF and XDP Reference Guide](http://docs.cilium.io/en/latest/bpf/): Guide from the Cilium project. - [Why is the kernel community replacing iptables with BPF?](https://cilium.io/blog/2018/04/17/why-is-the-kernel-community-replacing-iptables/): A blog post by Cilium on the the motivations behind eBPF and bpfilter, with a couple examples and links to other projects using eBPF and bpfilter. - [Bpfilter: Linux firewall with eBPF sauce](https://qmo.fr/docs/talk_20180316_frnog_bpfilter.pdf): Slides from a talk by Quentin Monnet with a background on eBPF and comparing bpfilter to iptables. - [Cilium: Networking & Security for Containers with BPF & XDP](http://www.slideshare.net/ThomasGraf5/clium-container-networking-with-bpf-xdp): Also featuring a load balancer use case - [Cilium: Networking & Security for Containers with BPF & XDP](http://www.slideshare.net/Docker/cilium-bpf-xdp-for-containers-66969823) - [video](https://www.youtube.com/watch?v=TnJF7ht3ZYc&list=PLkA60AVN3hh8oPas3cq2VA9xB7WazcIgs) - [Cilium: Fast IPv6 container Networking with BPF and XDP](http://www.slideshare.net/ThomasGraf5/cilium-fast-ipv6-container-networking-with-bpf-and-xdp) - [Cilium: BPF & XDP for containers](https://fosdem.org/2017/schedule/event/cilium/). - [Learning ebpf book](https://github.com/lizrice/learning-ebpf): Learning eBPF, published by O'Reilly! Here's where you will find a VM config for the examples. ## Articles and Presentations - [Introduction to Cilium](https://www.youtube.com/watch?v=80OYrzS1dCA): A livestream covering all things related to eBPF and Cilium presented by Isovalent's Thomas Graf & Liz Rice. - [Cilium CNI](https://medium.com/itnext/cilium-cni-a-comprehensive-deep-dive-guide-for-networking-and-security-enthusiasts-588afbf72d5c): A Comprehensive Deep Dive Guide for Networking and Security Enthusiasts. - [Cilium for Kubernetes networking](https://blog.palark.com/why-cilium-for-kubernetes-networking/): Why we use it and why we love it. 
- [A generic introduction to Cilium](https://opensource.googleblog.com/2016/11/cilium-networking-and-security.html) - [A podcast interviewing Thomas Graf](http://blog.ipspace.net/2016/10/fast-linux-packet-forwarding-with.html) - Ivan Pepelnjak interviewing Thomas, October 2016, on eBPF, P4, XDP and Cilium. - [How eBPF Streamlines the Service Mesh](https://thenewstack.io/how-ebpf-streamlines-the-service-mesh/): Explore how eBPF allows us to streamline the service mesh, making the data plane more efficient and easier to deploy. - [From Amazon VPC CNI to Cilium with Zero Downtime](https://medium.com/codex/migrate-to-cilium-from-amazon-vpc-cni-with-zero-downtime-493827c6b45e): Migrate to Cilium from Amazon VPC CNI with Zero Downtime - [Cilium CNI and OKE on Oracle Cloud](https://medium.com/oracledevs/cni-adventures-with-kubernetes-on-oracle-cloud-cilium-5c6f011746d5): Kubernetes Networking with Cilium CNI and OKE on Oracle Cloud - [Cilium in Azure Kubernetes Service (AKS)](https://learn.microsoft.com/en-us/azure/aks/azure-cni-powered-by-cilium): Configure Azure CNI Powered by Cilium in Azure Kubernetes Service (AKS) - [eCHO News NEWSLETTER](https://www.linkedin.com/newsletters/echo-news-6937495018668482560/): eCHO news in a bi-weekly wrap up of all things eBPF and Cilium. - [Exploring eBPF and XDP](https://naftalyava.com/example-xdp-ebpf-code-for-handling-ingress-traffic/): An Example - A basic example how to get started with XDP - [eBPF - Rethinking the Linux Kernel](https://docs.google.com/presentation/d/1AcB4x7JCWET0ysDr0gsX-EIdQSTyBtmi6OAW7bE0jm0/edit#slide=id.g6e43ab8f8d_0_612): eBPF JavaScript-like capabilities to the Linux Kernel ## Community Events - [CiliumCon](https://cilium.io/events/): CiliumCon is a full-day co-located event for Cilium users, contributors, and new community members. - [Isovalent Security Summer School 2023](https://isovalent.com/events/2023-07-security-summer-school/): Virtual Security Summer School where you can level up your skills with hands-on labs. Learn how Cilium, Tetragon, and Hubble help improve Kubernetes security. - [Isovalent's cilium related events](https://isovalent.com/events/): events featuring diverse voices, innovative companies and big ideas ## Community and Contributing - [Slack channel](https://cilium.herokuapp.com/): For live conversation and quick questions, join the Cilium Slack workspace - [Twitter](https://twitter.com/ciliumproject): Follow Cilium on Twitter for the latest news and announcements. - [YouTube](https://www.youtube.com/c/eBPFCiliumCommunity): Watch the videos from the Cilium and eBPF Communities. - [Contributors](https://github.com/cilium/cilium/graphs/contributors): Contributions to main. ## Hands on Contents - [Isovalent library for Cilium](https://isovalent.com/resource-library/): You can find Videos `Case studies`, `Blogs`, `Books`, `Labs` and `Analyst Reports`. - [Cilium Learning Tracks](https://isovalent.com/learning-tracks/) : Pick from tracks for cloud network engineers, security professionals, platform engineers, platform ops (service mesh), and cloud architects ## Contributing Contributions welcome! Read the [contribution guidelines](contributing.md) first. ## License [![CC0](http://mirrors.creativecommons.org/presskit/buttons/88x31/svg/cc-zero.svg)](http://creativecommons.org/publicdomain/zero/1.0) To the extent possible under law, rajhiseif has waived all copyright and related or neighboring rights to this work.
verytinydever/AGE_Application
https://github.com/verytinydever/AGE_Application
null
README
Shirley-WeiVi/ShirleyAi-NlpCorpusViews
https://github.com/Shirley-WeiVi/ShirleyAi-NlpCorpusViews
A dedicated NLP corpus view tool for ShirleyAi; it can be used to edit and manage the corpus, export it in different formats, or update it.
# ShirleyAi — Dedicated Corpus Management Tool ### Usage Notes This application is a manual annotation tool and does not support bulk import. The goal of ShirleyAi is to turn my girlfriend into an AI: chats with her always contain sentences or grammar that are hard to explain, so this tool collects and manages our chat dialogues. Dialogue snippets and their context are entered by hand, and a training dataset can be exported through the API. If you know of a way to extract WeChat chat history, please tell me and I can build an extraction entry point into the tool. QQ group: 1130531420 Demo video: https://www.bilibili.com/video/BV1Wk4y1P76W/ New feature demo: https://www.bilibili.com/video/BV1U14y1R7pQ ### Features 1. Manage dialogues (create, read, update, delete)<br> 2. Dataset export API, which can be extended to define your own export formats<br> 3. Database-backed storage<br> 4. New: audio management<br> 5. New: sticker sending<br> 6. New: image sending ### Current Feature Design Product prototype: https://mastergo.com/goto/s0QKpitj?page_id=M&file=98653253651947 You are invited to open the "MainUi" page via the link to start collaborating ### Development Notes Built with Python, using the Flask framework for the API endpoints, SQLAlchemy for database access, and a Vue-based interface for data management
Prajwal0225/Booking_Application
https://github.com/Prajwal0225/Booking_Application
null
# Booking_Application ![image](https://github.com/Prajwal0225/Booking_Application/assets/103810119/dd3cf42f-c295-4ae3-80df-d979a2f5a849) --- ## Usage - Register: Users can create an account by providing their name, email, and password. Passwords are securely encrypted. ![Register page](https://github.com/Prajwal0225/Booking_Application/assets/103810119/ba07ddc3-1303-4a1b-8258-db69e7c2c351) ![register succesfully](https://github.com/Prajwal0225/Booking_Application/assets/103810119/6fef02d5-eb1e-45be-98db-78543596964d) --- - Login: Existing users can log in with their email and password. ![log in page](https://github.com/Prajwal0225/Booking_Application/assets/103810119/97c764f0-493d-4ff8-93f5-28a2175080a3) ![log in succesfully](https://github.com/Prajwal0225/Booking_Application/assets/103810119/8ced08bf-ba56-47d7-98fe-324b632ed610) --- - View Available Places: Users can browse the list of available accommodations, including title, address, description, price, and photos. ![homepage 1](https://github.com/Prajwal0225/Booking_Application/assets/103810119/851a3235-f072-480c-8bd9-d1fda429d188) ![hom page 2](https://github.com/Prajwal0225/Booking_Application/assets/103810119/e67abeb7-9c22-45a9-ac64-069d2a14830b) --- - More Photos: If you like someplace and wanted to see more photos then you can do it by clicking ![image](https://github.com/Prajwal0225/Booking_Application/assets/103810119/f2fdd529-68a8-47e4-944d-48a7b2876041) https://github.com/Prajwal0225/Booking_Application/assets/103810119/9158ba3b-0a39-4729-add2-7e8b6fc2c4be --- - View Single Places: Users can see any available accommodation, including title, address, description, price, and photos. ![seprate page 1](https://github.com/Prajwal0225/Booking_Application/assets/103810119/fd134fa1-ba4e-46cd-b06e-a7f6040c645a) ![seprate page 2](https://github.com/Prajwal0225/Booking_Application/assets/103810119/20fa5da2-b62a-4a4c-8276-2e9a7c918b96) --- - Bookings: Users can make bookings by selecting a place, specifying check-in and check-out dates, number of guests, and providing their contact information. ![image](https://github.com/Prajwal0225/Booking_Application/assets/103810119/794068e1-e809-4220-afda-3b9ab4a29e7d) ![image](https://github.com/Prajwal0225/Booking_Application/assets/103810119/8cb3c054-867a-43d3-ba19-8929c11ca186) --- - View Bookings: Users can see their previous bookings and their details. ![image](https://github.com/Prajwal0225/Booking_Application/assets/103810119/02b906fe-364c-4a09-ba1a-b7e8ede110e9) --- - Manage Profile: Users can log out. - ![image](https://github.com/Prajwal0225/Booking_Application/assets/103810119/21aeb994-64b3-4030-89a7-a7c01b3cee4f) --- - Add Your Accommodation: Users can add the details of their own accommodations, including title, address, description, price, check-in/check-out times, maximum number of guests, and additional information. If they wanted their Accommodation to rent out. 
![image](https://github.com/Prajwal0225/Booking_Application/assets/103810119/ebd09904-eda6-47d9-bfe7-2a68b5f819a3) Click on add a new place to add your place ![add new places](https://github.com/Prajwal0225/Booking_Application/assets/103810119/d538e5ea-8be2-49dc-ac59-235f6c0e0730) Form to fill up information - Photo Upload: Users can upload photos for their places, either by selecting files from their device or providing a URL for the photo Url: ![add a photo by link](https://github.com/Prajwal0225/Booking_Application/assets/103810119/cf78602d-958a-4398-b18c-0958dc6bb82b) ![photo added](https://github.com/Prajwal0225/Booking_Application/assets/103810119/dfc556c1-4691-4bfe-9ae4-723a8cf5b3bb) From Device ![upload form device](https://github.com/Prajwal0225/Booking_Application/assets/103810119/4da619c8-2fca-4587-a0fe-96bc2c4cc2cb) ![selecting photo from device](https://github.com/Prajwal0225/Booking_Application/assets/103810119/6cd24039-442e-44d4-89b3-478e29d30a33) ![phot added by upload](https://github.com/Prajwal0225/Booking_Application/assets/103810119/96453055-dda4-4eae-a88b-1533954e0550) You can also delete and select a photo as a favorite that photo should be displayed on your front page and your accommodation section You can select the star to make it your favorite photo and if you want to delete click on the delete button ![image](https://github.com/Prajwal0225/Booking_Application/assets/103810119/fa71fe43-21de-419b-a286-605aac5a68e7) Other information ![image](https://github.com/Prajwal0225/Booking_Application/assets/103810119/80c15967-23c4-4015-91a7-35eb2b297823) ![added to my accomoodation](https://github.com/Prajwal0225/Booking_Application/assets/103810119/f6cec9b2-19ff-45ac-bb4e-1962df25d695) - Edit Accommodation Details: Users can edit the details of their own accommodations, including title, address, description, price, check-in/check-out times, the maximum number of guests, and additional information. ![edit the accomondations](https://github.com/Prajwal0225/Booking_Application/assets/103810119/c9a591c3-b8de-4128-a54b-bff07736be5e) ![updated](https://github.com/Prajwal0225/Booking_Application/assets/103810119/d03209b1-0426-4daf-9d20-0668901b82f0) - Database: ![image](https://github.com/Prajwal0225/Booking_Application/assets/103810119/ca128d8e-f981-4b7f-901e-60a0162ba3b6) ## Features - User Registration and Authentication: Users can create accounts and log in securely with encrypted passwords. Password is encrypted using bcrypt ![image](https://github.com/Prajwal0225/Booking_Application/assets/103810119/4f776e4d-7a1c-4f28-a6af-75495df6e7de) - Place Listings: Users can view a list of available accommodations with details such as title, address, description, price, and photos. - Booking Management: Users can make bookings for desired places, specify check-in/check-out dates, number of guests, and view their booking details. - Profile Management: Users can view and update their profile information, including name, email, and contact details. - File Upload: Users can upload photos for their places or their profile picture, either by selecting files from their device or providing a URL for the photo. - Edit Accommodation Details: Users can edit the details of their own accommodations, including title, address, description, price, check-in/check-out times, the maximum number of guests, and additional information. 
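The features above state that passwords are stored encrypted with bcrypt. The snippet below is only an illustrative sketch of how such hashing is commonly wired into an Express register/login flow; the route names, field names, and in-memory `users` store are assumptions made for the example and are not taken from this repository's code.

```javascript
// Hypothetical sketch of bcrypt-based password handling (not this repo's actual code).
const express = require("express");
const bcrypt = require("bcryptjs");

const app = express();
app.use(express.json());

const users = []; // in-memory stand-in for the MongoDB collection

app.post("/register", (req, res) => {
  const { name, email, password } = req.body;
  // Hash the plain-text password with a generated salt before storing it.
  const passwordHash = bcrypt.hashSync(password, bcrypt.genSaltSync(10));
  users.push({ name, email, passwordHash });
  res.json({ name, email });
});

app.post("/login", (req, res) => {
  const { email, password } = req.body;
  const user = users.find((u) => u.email === email);
  // Compare the submitted password against the stored hash.
  if (user && bcrypt.compareSync(password, user.passwordHash)) {
    return res.json({ message: "logged in" });
  }
  res.status(401).json({ message: "invalid credentials" });
});

app.listen(4000);
```

Because only the salted hash is stored, the database never holds the plain-text password, which matches the hashed value visible in the screenshot above.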
## Technologies Used - **Front-end**: React, React Router DOM, Axios - **Back-end**: Node.js, Express.js - **Database**: MongoDB, Mongoose - **Authentication**: JSON Web Tokens (JWT), bcrypt.js - **File Upload**: Multer - **HTTP Requests**: Axios - **Styling**: CSS
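The technology list above includes JSON Web Tokens (JWT) for authentication. As a rough, hypothetical sketch (the secret source, payload fields, and expiry are assumptions, not this repository's actual configuration), issuing and checking a token with the `jsonwebtoken` package could look like this:

```javascript
const jwt = require("jsonwebtoken");

const JWT_SECRET = process.env.JWT_SECRET || "change-me"; // assumed secret source

// Issue a token after a successful login.
const issueToken = (user) =>
  jwt.sign({ id: user.id, email: user.email }, JWT_SECRET, { expiresIn: "7d" });

// Verify the token on protected routes; returns the decoded payload or throws.
const verifyToken = (token) => jwt.verify(token, JWT_SECRET);

// Example usage
const token = issueToken({ id: "123", email: "[email protected]" });
console.log(verifyToken(token).email); // "[email protected]"
```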
lemonlove7/EHole_magic
https://github.com/lemonlove7/EHole_magic
A modified build of EHole (棱洞). It can run vulnerability detection against the key assets it identifies (supports extracting assets from Hunter and FOFA) and supports FTP service identification and brute-forcing.
# EHole_magic When footprinting web targets, fingerprinting them during information gathering is an essential step, but the key assets that EHole identifies still have to be checked with separate exploitation tools, which is inconvenient. A vulnerability-detection feature has therefore been added on top of the original to speed up footprinting (the original functionality is unaffected). # Workflow Fingerprint the assets --> run vulnerability detection against the key assets --> e.g. http://www.xxx.com runs Tongda OA --> scan for Tongda OA vulnerabilities -> brute-force the FTP service <img width="679" alt="image" src="https://github.com/lemonlove7/EHole_magic/assets/56328995/86acca48-90af-4a47-b197-7e4ef5df8a4b"> # Usage Vulnerability detection is off by default; change poc=no to poc=yes in poc.ini to enable it. FTP brute-forcing is off by default; change brute=no to brute=yes in poc.ini to enable it. ### FOFA identification Note: identification via FOFA requires a FOFA key and email, configured in config.ini. If a search returns no results, change e.g. domain="baidu.com" to domain='"baidu.com"' ``` ehole finger -s domain="baidu.com" // supports all FOFA syntax ``` ### Hunter identification Note: identification via Hunter requires a Hunter key, configured in config.ini. If a search returns no results, apply the same fix as for FOFA ``` ehole finger -b ip="180.140.20.182" // supports all Hunter syntax ``` ### Local identification ``` ehole finger -l 1.txt // load URLs from a file and scan them ``` ### Single-target identification ``` ehole finger -u http://www.baidu.com // check a single URL ``` # Screenshots <img width="930" alt="image" src="https://github.com/lemonlove7/EHole_magic/assets/56328995/e4064f38-6458-4778-a2f5-b7db2de54b1d"> # Changelog ### July 5, 2023 1. Fixed the segmentation fault on the macOS build 2. Added extracting assets from Hunter for fingerprinting <img width="892" alt="image" src="https://github.com/lemonlove7/EHole_magic/assets/56328995/f2c733d8-6bf6-4d56-b31b-61015421fd7c"> <img width="807" alt="image" src="https://github.com/lemonlove7/EHole_magic/assets/56328995/dfb7b200-4938-41b1-88c3-9756c6970ebc"> ### July 7, 2023 1. Added more common PoCs 2. Added FTP service checks; set brute to yes in poc.ini to enable password brute-forcing <img width="536" alt="image" src="https://github.com/lemonlove7/EHole_magic/assets/56328995/d28ca59b-3eab-4ef8-a19a-d8726ed274f2"> <img width="672" alt="image" src="https://github.com/lemonlove7/EHole_magic/assets/56328995/2fe41884-dd51-48ce-8651-b20e3a4a8139"> ### July 13, 2023 1. Updated the fingerprint database to 10,000+ fingerprints 2. Fixed missing asset output after fofa -f or -s 3. Added SQL injection detection for the Hongjing eHR human-resources management system <img width="1408" alt="image" src="https://github.com/lemonlove7/EHole_magic/assets/56328995/98c7590d-0dce-4b10-a329-5db313c5e484"> <img width="1132" alt="image" src="https://github.com/lemonlove7/EHole_magic/assets/56328995/78af3d01-7c23-4058-9071-402e07dbea6a"> ### July 26, 2023 1. Added the recently disclosed vulnerabilities (Yonyou, Kingdee Cloud, Dahua) 2. Improved fingerprint identification ## Reference Projects POC-bomber: https://github.com/tr0uble-mAker/POC-bomber PeiQi wiki: https://peiqi.wgpsec.org EHole: https://github.com/EdgeSecurityTeam/EHole
genabdulrehman/coffee-app
https://github.com/genabdulrehman/coffee-app
null
# coffe_app A new Flutter project. ## Project Overview https://github.com/genabdulrehman/coffee-app/assets/76816147/c3f97a22-6cc3-43a1-b51e-8d42e73949f1
mkazemiraz/one-line-javascript-utility-codes
https://github.com/mkazemiraz/one-line-javascript-utility-codes
✅ A collection of useful one-line JavaScript utility codes.
# One-line JavaScript Utility Codes ![One-line JavaScript Utility Codes](./assets/cover.png) [![Open Source Helpers](https://www.codetriage.com/mkazemiraz/one-line-javascript-utility-codes/badges/users.svg)](https://www.codetriage.com/mkazemiraz/one-line-javascript-utility-codes) If you like this content, you can ping me or follow me on Twitter 👊 [![Twitter URL](https://img.shields.io/twitter/url?url=https%3A%2F%2Ftwitter.com%2Fmkazemiraz&style=social&logo=twitter&label=Follow%20%40mkazemiraz)](https://twitter.com/mkazemiraz) ## Introduction 😍 Welcome to the One-Line JavaScript Utility Codes repository! Here, we aim to provide you with a curated collection of powerful and concise JavaScript utility functions that can be written in just one line of code. ### Motivation In the world of software development, efficiency and productivity are crucial factors for success. As developers, we often find ourselves performing repetitive tasks or writing lengthy code to accomplish simple operations. This is where one-line utility codes come to the rescue! The primary motivation behind this repository is to empower developers like you with a toolbox of concise and reusable utility functions that can significantly simplify your code and boost your productivity. With these one-liners, you'll be able to accomplish common tasks quickly and elegantly, saving you valuable time and effort. By leveraging the expressive nature of JavaScript, we have carefully crafted these utility functions to encapsulate complex logic into a single line, without sacrificing readability or performance. Each utility code is designed to be self-contained, meaning you can easily copy and paste it into your projects without any dependencies. Whether you're a beginner learning JavaScript or an experienced developer seeking to optimize your workflow, this repository has something for everyone. We cover a wide range of use cases, including string manipulation, array operations, mathematical calculations, date and time handling, object manipulation, and much more. Furthermore, we encourage collaboration and community participation. If you have your own powerful one-liners that you'd like to share with the world, we invite you to contribute to this repository. Together, we can build a comprehensive library of JavaScript utility codes that will benefit developers worldwide. So go ahead, explore the repository, and unlock the potential of these one-line utility codes. Simplify your code, increase your productivity, and take your JavaScript skills to the next level. Let's make coding simpler, one line at a time! 
🤟 ## Table of Contents - [Copy text to clipboard](#copy-text-to-clipboard) - [Shuffle an array](#shuffle-an-array) - [Convert rgba color code to hexadecimal](#convert-rgba-color-code-to-hexadecimal) - [Convert hexadecimal color code to rgb](#convert-hexadecimal-color-code-to-rgb) - [Calculates the average value of a list of number](#calculates-the-average-value-of-a-list-of-number) - [Check a number is even or odd](#check-a-number-is-even-or-odd) - [Remove duplicate elements from array](#remove-duplicate-elements-from-array) - [Check an object is empty](#check-an-object-is-empty) - [Reverse a string](#reverse-a-string) - [Calculate the number of days between two dates](#calculate-the-number-of-days-between-two-dates) - [Capitalize text](#capitalize-text) - [Generate a random string](#generate-a-random-string) - [Generate a random number between two numbers](#generate-a-random-number-between-two-numbers) - [Clear all cookies](#clear-all-cookies) - [Scroll to top](#scroll-to-top) - [Generate a random boolean](#generate-a-random-boolean) - [Identify the type of a data](#identify-the-type-of-a-data) - [Calculate percent]("#calculate-percent) - [Get a random item of array](#get-a-random-item-of-array) - [Sort array of objects based on the values of the specified key](#sort-array-of-objects-based-on-the-values-of-the-specified-key) - [Check if arrays / objects are equal](#check-if-arrays--objects-are-equal) - [Add a new item to the array at the specified position](#add-a-new-item-to-the-array-at-the-specified-position) - [Add commas to number](#add-commas-to-number) - [Flatten a nested array](#flatten-a-nested-array) - [Check if a Number is Prime](#check-if-a-number-is-prime) - [Check if a string is a palindrome](#check-if-a-string-is-a-palindrome) ### Copy text to clipboard ```javascript const copyToClipboard = (text) => addEventListener( "click", (_) => navigator.clipboard.writeText(text).catch(console.warn), { once: true } ); ``` #### Code Explanation The code snippet provided defines a JavaScript function called `copyToClipboard` that takes a text parameter. The purpose of this function is to copy the specified `text` to the user's clipboard. The function utilizes the `navigator.clipboard.writeText()` method, which is a built-in browser API for writing text to the clipboard. This method takes the `text` parameter as input and attempts to write it to the clipboard and returns a `Promise`. We chain `Promise.prototype.catch()` to handle errors or exceptions. Asynchronous `clipboard` write and read requires the calling `document` to be focused (user activation). `addEventListener('click', handler, {once: true})` is used to copy text to the clipboard when the `window` of the calling `document` is clicked after the function is called. #### Usage ```javascript copyToClipboard("Hello, Mkazemiraz!"); ``` > Note that this function relies on the availability of the `navigator.clipboard` API, which is supported by most modern browsers. If the API is not available in the user's browser, this code may not function as intended, and appropriate fallback mechanisms should be considered. ### Shuffle an array ```javascript const shuffleArray = (array) => array.sort(() => Math.random() - 0.5); ``` #### Code Explanation The provided code defines a JavaScript function called `shuffleArray` that takes an array (`array`) as a parameter. The purpose of this function is to shuffle the elements of the array randomly. The `sort` method is applied to the `array`, and a comparator function is passed as an argument. 
The comparator function uses `Math.random()` to generate a random number between 0 and 1. By subtracting 0.5 from this random number, the comparator function returns a positive or negative value, randomly changing the order of the array elements during the sorting process. By leveraging the built-in `sort` method with a random comparator, this one-liner provides a concise way to shuffle an array in JavaScript. #### Usage ```javascript const myArray = [1, 2, 3, 4, 5]; shuffleArray(myArray); console.log(myArray); // Outputs a randomly shuffled version of the array ``` > Please note that this shuffling technique is not guaranteed to produce perfectly uniform or unbiased results, especially for large arrays. For more rigorous shuffling requirements, you may need to explore alternative algorithms or libraries. ### Convert rgba color code to hexadecimal ```javascript const rgbToHex = (r, g, b) => "#" + ((1 << 24) + (r << 16) + (g << 8) + b).toString(16).slice(1); ``` #### Code Explanation This code defines a function `rgbToHex` that takes three parameters (`r`, `g`, and `b`) representing the red, green, and blue components of an RGB color code, respectively. It converts the RGB color code to its corresponding hexadecimal representation. The code works by performing bitwise operations and string manipulation to convert the RGB components to hexadecimal values. Here's how it works: 1. The expression `(1 << 24) + (r << 16) + (g << 8) + b` combines the RGB components into a single integer value. Bitwise shift operators (`<<`) are used to position each component at the appropriate bit position: red at 16 bits, green at 8 bits, and blue at 0 bits. The `<< 24` is used to ensure that the highest 8 bits are set to 1, as a placeholder for the alpha component (not included in this code). 2. The resulting integer value is converted to a hexadecimal string using `.toString(16)`. The `16` parameter specifies the radix for the conversion, which represents the base of the numeral system (in this case, hexadecimal). 3. The `.slice(1)` method is applied to remove the leading "1" from the hexadecimal string. This is necessary because the bitwise operation (`1 << 24`) sets the highest bit to 1, which adds an extra digit in the hexadecimal representation. 4. Finally, the "#" symbol is concatenated with the resulting hexadecimal string to form the complete hexadecimal color representation. #### Usage ```javascript const hexColor = rgbToHex(255, 0, 128); console.log(hexColor); // Outputs "#ff0080" ``` > Please note that this code assumes valid input values for the RGB components (`r`, `g`, and `b`) within the range of 0 to 255. If you have different requirements or need to handle edge cases, you may need to modify the code accordingly. ### Convert hexadecimal color code to rgb ```javascript const hexToRgba = (hex) => { const [r, g, b] = hex.match(/\w\w/g).map((val) => parseInt(val, 16)); return `rgba(${r}, ${g}, ${b})`; }; ``` #### Code Explanation This code defines a function `hexToRgba` that takes a hexadecimal color code as a parameter (`hex`). It converts the hexadecimal color code to its corresponding RGBA representation. Here's a breakdown of how the code works: 1. The code uses the `match` method on the `hex` string to extract pairs of two characters (`/\w\w/g`). Each pair represents the red, green, and blue components of the color in hexadecimal format. 2. The `map` method is then applied to the extracted pairs of characters. 
Within the `map` callback function, `parseInt(val, 16)` is used to convert each pair of characters from hexadecimal to decimal. 3. The resulting decimal values for the red, green, and blue components are destructured into the variables `r`, `g`, and `b`, respectively. 4. Finally, the function returns a string representing the RGBA color value using the interpolated values of `r`, `g`, and `b`. #### Usage ```javascript const hexColor = "#ff0080"; const rgbaColor = hexToRgba(hexColor); console.log(rgbaColor); // Outputs "rgba(255, 0, 128)" ``` ### Calculates the average value of a list of number ```javascript const average = (...args) => args.reduce((a, b) => a + b, 0) / args.length; ``` #### Code Explanation This code defines a function `average` that calculates the average value of a list of numbers. The function takes any number of arguments using the rest parameter syntax (`...args`), allowing you to pass in a variable number of arguments. Here's a breakdown of how the code works: 1. The `reduce` method is called on the `args` array, starting with an initial value of `0`. This method iterates over each element in the array and accumulates the sum of all the elements. 2. The callback function `(a, b) => a + b` is used in the `reduce` method. It takes two parameters `a` and `b`, representing the accumulated value and the current element, respectively. The sum of `a` and `b` is computed for each iteration. 3. After the `reduce` operation completes, the sum of all the elements in `args` is divided by the length of the `args` array using the `/` operator. This calculates the average value. 4. The computed average value is then returned by the `average` function. #### Usage ```javascript const result = average(4, 8, 12, 16); console.log(result); // Outputs 10 ``` > The code provides a concise way to calculate the average of a set of numbers, making it convenient for scenarios where you need to compute the average dynamically with varying numbers of input values. ### Check a number is even or odd ```javascript const isOdd = (num) => num % 2 !== 0; ``` #### Code Explanation This code defines a function `isOdd` that takes a number (`num`) as a parameter. It determines whether the given number is odd or not by performing a simple check. Here's an explanation of how the code works: 1. The code uses the modulo operator (`%`) to divide the `num` by `2`. The modulo operation calculates the remainder when `num` is divided by `2`. 2. The result of the modulo operation is compared to `0` using the strict inequality operator (`!==`). If the result is not equal to `0`, it means that the number is odd because odd numbers leave a remainder of `1` when divided by `2`. 3. The function returns a boolean value (`true` or `false`) based on the result of the comparison. If the number is odd, the expression `num % 2 !== 0` evaluates to `true`, indicating that the number is odd. Otherwise, if the number is even, the expression evaluates to `false`. #### Usage ```javascript console.log(isOdd(5)); // Outputs true console.log(isOdd(10)); // Outputs false ``` ### Remove duplicate elements from array ```javascript const uniqueArray = (arr) => [...new Set(arr)]; ``` #### Code Explanation This code defines a function `uniqueArray` that takes an array (`arr`) as a parameter. It returns a new array that contains only the unique elements from the input array, removing any duplicate values. Here's an explanation of how the code works: 1. The code uses the `Set` object to create a new set `new Set(arr)` from the input array. 
The `Set` object only allows unique values, automatically eliminating any duplicate elements. 2. The spread syntax (`...`) is used to convert the set back into an array. By spreading the elements of the set within a new array `[...new Set(arr)]`, a new array is created with only the unique elements. 3. The resulting array, containing only the unique elements from the input array, is then returned by the `uniqueArray` function. #### Usage ```javascript const numbers = [1, 2, 3, 4, 4, 3, 2, 1]; const uniqueNumbers = uniqueArray(numbers); console.log(uniqueNumbers); // Outputs [1, 2, 3, 4] ``` ### Check an object is empty ```javascript const isEmptyObject = (obj) => obj && Object.keys(obj).length === 0; ``` #### Code Explanation This code defines a function `isEmptyObject` that takes an object (`obj`) as a parameter. It checks if the object is empty by using two conditions. Here's an explanation of how the code works: 1. The `obj &&` condition checks if the object `obj` is truthy (not null or undefined). If `obj` is falsy, the condition short-circuits, and the function immediately returns `false`, indicating that the object is not empty. 2. The `Object.keys(obj).length === 0` condition checks if the number of own properties in the object is equal to `0`. `Object.keys(obj)` retrieves an array of all the enumerable own property names of the object, and `length === 0` checks if this array is empty. If the object has no own properties, this condition evaluates to `true`. The combination of these two conditions ensures that the function returns `true` only if the object is both truthy and has no own properties, indicating that it is empty. If any of the conditions is not met, the function returns `false`, indicating that the object is not empty. #### Usage ```javascript const emptyObj = {}; const nonEmptyObj = { key: "value" }; const nullObj = null; console.log(isEmptyObject(emptyObj)); // Outputs true console.log(isEmptyObject(nonEmptyObj)); // Outputs false console.log(isEmptyObject(nullObj)); // Outputs false ``` ### Reverse a string ```javascript const reverseStr = (str) => (str ?? "").split("").reverse().join(""); ``` #### Code Explanation This code defines a function called `reverseStr` that takes a string `str` as a parameter. It performs the following operations to reverse the string: 1. `str.split('')` splits the string into an array of individual characters. Each character becomes an element of the array. 2. `.reverse()` reverses the order of elements in the array, effectively reversing the order of characters in the string. 3. `.join('')` joins the elements of the array back into a string, with no separator between them. This results in the reversed string. #### Usage ```javascript console.log(reverseStr("Hello")); // Outputs 'olleH' ``` ### Calculate the number of days between two dates ```javascript const dayDiff = (d1, d2) => Math.ceil(Math.abs(d1.getTime() - d2.getTime()) / 86400000); ``` #### Code Explanation This code defines a function called `dayDiff` that calculates the number of days between two given dates (`d1` and `d2`). Here's a breakdown of how the code works: 1. `d1.getTime()` and `d2.getTime()` retrieve the numeric representation of the given dates in milliseconds since January 1, 1970 (Unix timestamp). 2. `Math.abs(d1.getTime() - d2.getTime())` calculates the absolute difference between the two timestamps, representing the duration in milliseconds. 3. 
The result of the difference is divided by `86400000`, which is the number of milliseconds in a day, to convert the duration from milliseconds to days. 4. `Math.ceil()` rounds up the calculated number of days to the nearest whole number, ensuring that partial days are rounded up. 5. The final result is returned by the arrow function. #### Usage ```javascript const date1 = new Date("2023-07-01"); const date2 = new Date("2023-07-05"); console.log(dayDiff(date1, date2)); // Outputs 4 ``` > Please note that this code assumes valid input values for `dayDiff` function. It may throw errors or unexpected results if invalid inputs are provided. ### Capitalize Text ```javascript const capitalize = (str) => str.charAt(0).toUpperCase() + str.slice(1); ``` #### Code Explanation This code defines a function called `capitalize` that capitalizes the first character of a given string `str`. Here's a breakdown of how the code works: 1. `str.charAt(0)` retrieves the first character of the string `str`. 2. `.toUpperCase()` converts the first character to uppercase. 3. `str.slice(1)` extracts the remaining characters of the string starting from the second character. 4. The capitalized first character and the remaining characters are concatenated using the `+` operator. 5. The final result is returned by the arrow function. #### Usage ```javascript console.log(capitalize("hello")); // Outputs 'Hello' ``` ### Generate a random string ```javascript const generateRandomString = (length) => [...Array(length)].map(() => Math.random().toString(36)[2]).join(""); ``` #### Code Explanation This code defines a function called `generateRandomString` that generates a random string of a specified `length`. Here's a breakdown of how the code works: 1. `[...Array(length)]` creates an array of length `length` with empty slots. 2. `.map(() => Math.random().toString(36)[2])` maps over each element of the array and generates a random string character by calling `Math.random().toString(36)[2]`. The `Math.random()` function generates a random decimal between 0 and 1, `toString(36)` converts it to a base-36 string, and `[2]` selects the third character of the string. 3. `.join('')` joins all the generated random string characters together into a single string. 4. The final result, which is the randomly generated string of the specified `length`, is returned by the arrow function. #### Usage ```javascript console.log(generateRandomString(8)); // Outputs a random string of length 8 ``` > Please note that this code assumes valid input values for `generateRandomString` function. It may throw errors or unexpected results if invalid inputs are provided. ### Generate a random number between two numbers ```javascript const random = (min, max) => Math.floor(Math.random() * (max - min + 1) + min); ``` #### Code Explanation This code defines a function called `random` that generates a random integer between the specified `min` and `max` values (inclusive). Here's a breakdown of how the code works: 1. `Math.random()` generates a random decimal between 0 (inclusive) and 1 (exclusive). 2. `(max - min + 1)` calculates the range of possible values, including both `min` and `max`. 3. `Math.floor(...)` rounds down the result of the calculation to the nearest integer, ensuring that the generated random number is an integer. 4. The final result, which is the randomly generated integer within the specified range, is returned by the arrow function. 
#### Usage ```javascript console.log(random(1, 10)); // Outputs a random integer between 1 and 10 (inclusive) ``` ### Clear all cookies ```javascript const clearCookies = () => document.cookie .split(";") .forEach( (cookie) => (document.cookie = cookie .replace(/^ +/, "") .replace(/=.*/, `=;expires=${new Date(0).toUTCString()};path=/`)) ); ``` #### Code Explanation This code defines a function called `clearCookies` that clears all cookies in the current browser session. Here's a simplified explanation of how the code works: 1. `document.cookie` retrieves the current cookie string. 2. `.split(';')` splits the cookie string into individual cookies. 3. `.forEach(cookie => ...)` iterates over each cookie. 4. `cookie.replace(/^ +/, '')` removes leading whitespace from the cookie. 5. `.replace(/=.*/, `=;expires=${new Date(0).toUTCString()};path=/`)` replaces the cookie's value with an empty value and sets its expiration date to the past, effectively deleting the cookie. By calling `clearCookies()`, all cookies will be cleared in the current browser session. #### Usage ```javascript clearCookies(); ``` > It's important to note that manipulating and clearing cookies directly in JavaScript has limitations and considerations based on domain and security settings. Therefore, it's essential to use cookie operations responsibly and be aware of any potential implications. ### Scroll to top ```javascript const goToTop = () => window.scrollTo(0, 0); ``` #### Code Explanation This code defines a constant variable named `goToTop`, which is assigned an arrow function. The arrow function has no parameters and consists of a single statement: `window.scrollTo(0, 0)`. The `window.scrollTo()` method is used to scroll the window to a specified position. In this case, it is scrolling to the coordinates (0, 0), which represents the top-left corner of the window. So, when the `goToTop` function is called, it will scroll the window to the top, bringing the top of the page into view. #### Usage ```javascript goToTop(); ``` ### Generate a random boolean ```javascript const randomBoolean = () => Math.random() >= 0.5; ``` #### Code Explanation This code defines a function called `randomBoolean` that returns a random boolean value. Here's how it works: - The `Math.random()` function generates a random decimal number between 0 (inclusive) and 1 (exclusive). - The expression `Math.random() >= 0.5` compares the generated random number with 0.5. If the random number is greater than or equal to 0.5, it will evaluate to `true`; otherwise, it will evaluate to `false`. - The function `randomBoolean` encapsulates this logic and returns the resulting boolean value. In summary, when you call the `randomBoolean` function, it will generate a random boolean value with a roughly 50% chance of being `true` and a 50% chance of being `false`. #### Usage ```javascript randomBoolean(); ``` ### Identify the type of a data ```javascript const typeOf = (data) => Object.prototype.toString.call(data).slice(8, -1).toLowerCase(); ``` #### Code Explanation This code defines a function called `typeOf` that takes a parameter called `data`. This function determines the type of the `data` object and returns it as a lowercase string. The code achieves this by using the `Object.prototype.toString.call(data)` method. This method returns a string representation of the object's type. By calling `slice(8, -1)` on the result, the code extracts the type portion from the string representation. 
Finally, `toLowerCase()` is used to convert the type string to lowercase. #### Usage ```javascript const myArray = [1, 2, 3]; const myObject = { name: "John", age: 25 }; const myFunction = () => { console.log("Hello!"); }; console.log(typeOf(myArray)); // Output: "array" console.log(typeOf(myObject)); // Output: "object" console.log(typeOf(myFunction)); // Output: "function" console.log(typeOf(42)); // Output: "number" console.log(typeOf("Hello")); // Output: "string" console.log(typeOf(true)); // Output: "boolean" ``` ### Calculate percent ```javascript const calculatePercent = (value, total) => Math.round((value / total) * 100); ``` #### Code Explanation This code defines a function called `calculatePercent` that takes two parameters: `value` and `total`. This function calculates the percentage of `value` relative to `total` and returns the result as a rounded whole number. Here's a breakdown of the code: 1. `Math.round((value / total) * 100)`: This expression calculates the ratio of `value` to `total` by dividing `value` by `total` and then multiplying the result by 100 to get the percentage. The `Math.round()` function is used to round the result to the nearest whole number. 2. `const calculatePercent = (value, total) =>`: This line defines the `calculatePercent` function with two parameters: `value` and `total`. These parameters represent the values for which you want to calculate the percentage. In summary, when you call the `calculatePercent` function with a specific `value` and `total`, it will calculate the percentage of `value` relative to `total` and return the result as a rounded whole number. #### Usage ```javascript const value = 75; const total = 200; const percentage = calculatePercent(value, total); console.log(percentage); // Output: 38 ``` ### Get a random item of array ```javascript const getRandomItem = (array) => array[Math.floor(Math.random() * array.length)]; ``` #### Code Explanation This code defines a function called `getRandomItem` that takes an array as a parameter. The purpose of this function is to return a random item from the given array. Here's how it works: 1. `Math.random()` generates a random floating-point number between 0 (inclusive) and 1 (exclusive). 2. `Math.random() * array.length` multiplies the random number by the length of the array. This will give a random number between 0 (inclusive) and the length of the array (exclusive). 3. `Math.floor()` rounds down the random number to the nearest whole number, ensuring it becomes a valid index within the range of the array indices. 4. `array[Math.floor(Math.random() * array.length)]` uses the calculated random index to retrieve the corresponding item from the array. 5. The retrieved item is then returned as the result of the function. In summary, this code allows you to randomly select an item from an array by generating a random index and retrieving the corresponding item from that index. #### Usage ```javascript const fruits = ["apple", "banana", "orange", "grape", "kiwi"]; const randomFruit = getRandomItem(fruits); console.log(randomFruit); ``` > please note that the code assumes that the input array is not empty. ### Sort array of objects based on the values of the specified key ```javascript const sortBy = (arr, key) => arr.sort((a, b) => (a[key] > b[key] ? 1 : a[key] < b[key] ? -1 : 0)); ``` #### Code Explanation This code defines a function called `sortBy` that takes an array (`arr`) and a key (`key`) as parameters. It sorts the array of objects based on the values of the specified key. 
The `sort()` method is used on the array, and a comparison function is provided as an argument. The comparison function `(a, b) => a[key] > b[key] ? 1 : a[key] < b[key] ? -1 : 0` compares two objects `a` and `b` based on the values of the specified key. If the value of `a[key]` is greater than `b[key]`, it returns `1`, indicating that `a` should come after `b`. If the value of `a[key]` is less than `b[key]`, it returns `-1`, indicating that `a` should come before `b`. If the values are equal, it returns `0`, indicating that the order of `a` and `b` should remain unchanged. By using this function, you can sort an array of objects based on a specific key in ascending order. #### Usage ```javascript const items = [ { name: "Apple", price: 2.5 }, { name: "Banana", price: 1.5 }, { name: "Orange", price: 2.0 }, { name: "Grape", price: 3.0 }, ]; const sortedItems = sortBy(items, "price"); console.log(sortedItems); // [{ name: 'Banana', price: 1.5 }, { name: 'Orange', price: 2.0 }, { name: 'Apple', price: 2.5 }, { name: 'Grape', price: 3.0 }] ``` ### Check if arrays / objects are equal ```javascript const isEqual = (a, b) => JSON.stringify(a) === JSON.stringify(b); ``` #### Code Explanation This code defines a function named `isEqual` that checks if two objects `a` and `b` are equal. It uses the `JSON.stringify` method to convert the objects to JSON strings and then compares the strings for equality using the `===` operator. Here's a breakdown of the code: 1. `const isEqual = (a, b) =>`: This declares a function named `isEqual` with two parameters `a` and `b`. The function is defined using arrow function syntax. 2. `JSON.stringify(a)`: This converts the object `a` to a JSON string representation. 3. `JSON.stringify(b)`: This converts the object `b` to a JSON string representation. 4. `===`: This is the strict equality operator that checks if the two JSON strings are exactly equal, including their characters and order. #### Usage ```javascript const obj1 = { name: "John", age: 30 }; const obj2 = { name: "John", age: 30 }; const obj3 = { name: "Jane", age: 25 }; console.log(isEqual(obj1, obj2)); // true console.log(isEqual(obj1, obj3)); // false ``` > While this approach may work for simple objects, it has limitations when dealing with complex objects or objects containing functions or circular references. The JSON.stringify method is not designed to handle all types of objects correctly. To properly check for object equality, you can use a deep comparison algorithm that traverses the object properties recursively. ### Add a new item to the array at the specified position ```javascript const insert = (arr, index, newItem) => [ ...arr.slice(0, index), newItem, ...arr.slice(index), ]; ``` #### Code Explanation This code defines a function called `insert` that takes three parameters: an array `arr`, an index `index`, and a new item `newItem`. The function uses the spread syntax (`...`) to create a new array. Here's a breakdown of the code: 1. `arr.slice(0, index)`: This extracts a portion of the original array `arr` from the beginning (index `0`) up to, but not including, the specified `index`. It represents the elements that come before the insertion point. 2. `newItem`: This is the new item that you want to insert into the array. 3. `arr.slice(index)`: This extracts a portion of the original array `arr` starting from the specified `index` to the end. It represents the elements that come after the insertion point. 
This code allows you to insert a new item into an array at a specific index while preserving the order of the existing elements. #### Usage ```javascript const myArray = [1, 2, 3, 4]; const newArray = insert(myArray, 2, "new item"); console.log(newArray); // [1, 2, "new item", 3, 4] ``` ### Add commas to number ```javascript const addCommasToNumber = (number) => number .toString() .split(".") .map((part, index) => index === 0 ? part.replace(/\B(?=(\d{3})+(?!\d))/g, ",") : part ) .join("."); ``` #### Code Explanation This code defines a function called `addCommasToNumber` that formats a numeric value by adding commas as thousands separators. Here's a breakdown of the code: 1. The function takes a `number` parameter, which represents the numeric value to be formatted. 2. `number.toString()` converts the input value to a string. 3. `.split(".")` splits that string into the integer part and, if present, the decimal part. 4. `.map((part, index) => ...)` applies the comma insertion only to the first element (`index === 0`), i.e. the integer part; the decimal part is passed through unchanged. 5. The regular expression `/\B(?=(\d{3})+(?!\d))/g` matches every position inside the integer part (a non-word boundary, `\B`) that is followed by one or more complete groups of three digits (`(\d{3})+`) with no further digit after them (`(?!\d)`); a comma is inserted at each of those positions. 6. `.join(".")` puts the integer and decimal parts back together. 7. The final result is a string representation of the value with commas added as thousands separators. #### Usage ```javascript console.log(addCommasToNumber(12345.6789)); // 12,345.6789 ``` ### Flatten a nested array ```javascript const flattenArray = (arr) => arr.reduce( (flat, current) => flat.concat(Array.isArray(current) ? flattenArray(current) : current), [] ); ``` #### Code Explanation This code recursively flattens a nested array by iteratively concatenating the sub-arrays into a single flat array. Here's a breakdown of the code: 1. The `flattenArray` function takes an array (`arr`) as input. 2. Inside the function, the `reduce` method is used on the input array. 3. The `reduce` method iterates over each element of the array, accumulating a new value based on the logic provided. 4. The initial value for the accumulator is an empty array (`[]`). 5. For each element of the array, a check is performed using the `Array.isArray()` method to determine if the element is an array itself. 6. If the element is an array, the `flattenArray` function is recursively called on that sub-array. 7. If the element is not an array, it is directly concatenated to the accumulated value (`flat`). 8. The result of each iteration is the flattened array accumulated in the `flat` variable. 9. Once all elements have been processed, the `reduce` method returns the final flattened array. #### Usage ```javascript const numbers = [1, 2, [3, 4, [5, 6]], 7]; console.log(flattenArray(numbers)); // [1, 2, 3, 4, 5, 6, 7] ``` ### Check if a Number is Prime ```javascript const isPrime = (num) => num > 1 && Array.from({ length: Math.floor(Math.sqrt(num)) - 1 }, (_, i) => i + 2).every( (i) => num % i !== 0 ); ``` #### Code Explanation This code defines a function called `isPrime` that takes a number `num` as input. It checks if the number is prime by performing the following steps: 1. The condition `num > 1` ensures that the number is greater than 1, as prime numbers are defined as greater than 1. 2.
The code creates an array of candidate divisors using `Array.from`, containing the whole numbers from 2 up to the floor of the square root of `num` (for 2 and 3 this list is empty, so `every` returns `true` and they are correctly reported as prime). 3. The `every` method is called on the array, which checks if every element in the array satisfies a given condition. 4. The condition `(i) => num % i !== 0` is used to check if `num` is divisible by any of the numbers in the array (`i`). If the remainder of the division between `num` and any `i` is not equal to 0, it means that `num` is not divisible by `i`, indicating that it could be a prime number. 5. If all elements in the array satisfy the condition, the `every` method returns `true`, indicating that the number is prime. Otherwise, it returns `false`. #### Usage ```javascript console.log(isPrime(7)); // true console.log(isPrime(12)); // false console.log(isPrime(23)); // true console.log(isPrime(49)); // false ``` ### Check if a string is a palindrome ```javascript const isPalindrome = (str) => str.toLowerCase().replace(/[^a-z0-9]/g, "") === str .toLowerCase() .replace(/[^a-z0-9]/g, "") .split("") .reverse() .join(""); ``` #### Code Explanation This code checks if a given string is a palindrome. Here's a breakdown of the code: 1. `str.toLowerCase()` converts the string to lowercase to make the comparison case-insensitive. 2. `.replace(/[^a-z0-9]/g, '')` removes any non-alphanumeric characters from the string using a regular expression. This step ensures that only letters and numbers are considered in the palindrome check. 3. `===` is the equality operator that compares the modified string with the reversed version of itself. 4. `.split('').reverse().join('')` splits the modified string into an array of individual characters, reverses the order of the array, and then joins the characters back together into a string. 5. The result of the comparison is a boolean value (`true` if the string is a palindrome, `false` otherwise). #### Usage ```javascript const str1 = "A man, a plan, a canal: Panama"; const str2 = "Hello, World!"; console.log(isPalindrome(str1)); // Output: true console.log(isPalindrome(str2)); // Output: false ``` > Note that this code treats letters as case-insensitive and removes non-alphanumeric characters before performing the palindrome check.
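The note under "Shuffle an array" points out that the `sort`-based trick is not uniformly random and suggests exploring alternative algorithms. A common alternative is the Fisher–Yates shuffle; the sketch below is an illustrative addition by the editor, not part of the original collection.

```javascript
const fisherYatesShuffle = (array) => {
  // Walk the array from the end, swapping each element with a random earlier one.
  for (let i = array.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [array[i], array[j]] = [array[j], array[i]];
  }
  return array;
};

console.log(fisherYatesShuffle([1, 2, 3, 4, 5])); // e.g. [3, 1, 5, 2, 4]
```

Similarly, the note under "Check if arrays / objects are equal" mentions that a recursive deep comparison handles the cases `JSON.stringify` cannot. A minimal sketch of such a comparison (again, an illustrative addition rather than part of the original list, and it does not handle circular references) could look like this:

```javascript
const deepEqual = (a, b) => {
  if (a === b) return true;
  if (typeof a !== "object" || typeof b !== "object" || a === null || b === null) return false;
  if (Array.isArray(a) !== Array.isArray(b)) return false;
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  // Equal only when both have the same keys and every corresponding value is deeply equal.
  return keysA.length === keysB.length && keysA.every((key) => deepEqual(a[key], b[key]));
};

console.log(deepEqual({ a: [1, 2] }, { a: [1, 2] })); // true
console.log(deepEqual({ a: 1 }, { a: "1" })); // false
```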
baranacikgoz/PaginatedSearchAndFilter
https://github.com/baranacikgoz/PaginatedSearchAndFilter
A lightweight and flexible library designed to provide advanced search and advanced filter functionality along with pagination for .NET Core applications.
# PaginatedSearchAndFilter PaginatedSearchAndFilter is a lightweight and flexible library designed to provide search and filter functionality for paginated data in your .NET applications. It simplifies the implementation of search and filter features in your API endpoints, allowing you to offer advanced filtering and searching capabilities directly from the frontend. ## Table of Contents 1. [Features](#features) 1. [Installation](#installation) 2. [Usage](#usage) 3. [Examples](#examples) 4. [Contributing](#contributing) 5. [License](#license) 6. [Code of Conduct](#code-of-conduct) ## Features - Easy integration with your API endpoints - Support for JSON request bodies to encapsulate search and filter criteria - Programmatically construct and configure search requests using provided classes - Integration with LINQ and Ardalis.Specification for querying and filtering data - Seamless integration with EF Core for efficient database queries ## Installation TODO: Provide installation instructions for the package once it is published. ## Usage ### Creating the SearchRequest Object #### Option 1: JSON Request Body To use the SearchRequest object within a JSON request body, define an endpoint (controller or minimal API) that accepts a SearchRequest object as a parameter. This object will encapsulate the search and filter criteria. ```csharp public class ProductsController { [HttpPost("search")] public Task<PaginationResponse<ProductDto>> SearchAsync(SearchRequest request) { ... // use request object. } } ``` Your endpoint will accept the JSON representation of the SearchRequest: ```json { "advancedSearch": { "fields": [ "title", "description" ], "keyword": "phone" }, "keyword": "apple", "advancedFilter": { "logic": "and", "filters": [ { "field": "price.value", "operator": "gte", "value": "500" }, { "field": "category", "operator": "eq", "value": "electronics" } ] }, "pageNumber": 1, "pageSize": 10, "orderBys": [ { "field": "price.value", "isDescending": true }, { "field": "rating", "isDescending": true } ] } ``` #### Option 2: Using the Constructor in Code You can also create instances of the provided classes and configure the properties programmatically to define the search and filter options. ```csharp var searchRequest = new SearchRequest { AdvancedSearch = new AdvancedSearch { Fields = new List<string> { "title", "description" }, Keyword = "phone" }, Keyword = "apple", AdvancedFilter = new AdvancedFilter { Logic = "and", Filters = new List<Filter> { new Filter { Field = "price.value", Operator = "gte", Value = "500" }, new Filter { Field = "category", Operator = "eq", Value = "electronics" } } }, PageNumber = 1, PageSize = 10, OrderBys = new List<OrderBy> { new("price.value", true), new("rating", true) } }; ``` ## Examples ### LINQ TODO: Add examples for LINQ integration. 
### Specification ```csharp public class ProductsBySearchRequestSpec : EntitiesBySearchRequestSpecification<Product, ProductDto> { public ProductsBySearchRequestSpec(SearchRequest request) : base(request) { } } public class SearchProductsRequestHandler : IRequestHandler<SearchProductsRequest, PaginationResponse<ProductDto>> { private readonly IReadRepository<Product> _repository; public SearchProductsRequestHandler(IReadRepository<Product> repository) => _repository = repository; public async Task<PaginationResponse<ProductDto>> Handle(SearchProductsRequest request, CancellationToken cancellationToken) { var spec = new ProductsBySearchRequestSpec(request); return await _repository.PaginateBySpecAsync<Product,ProductDto>(spec, cancellationToken); } } ``` ## Contributing Contributions are welcome! If you encounter any issues or have suggestions for improvements, please open an issue on the GitHub repository. Additionally, feel free to submit pull requests with any enhancements or fixes you'd like to contribute. For more details, please check out the [contributing guide](CONTRIBUTING.md). ## License This project is licensed under the terms of the [Apache License](LICENSE). ## Code of Conduct This project follows the [Contributor Covenant Code of Conduct](CODE_OF_CONDUCT.md).
ryanlintott/ILikeToMoveIt
https://github.com/ryanlintott/ILikeToMoveIt
Accessible move actions for SwiftUI Lists and easy custom drag and drop for older iOS
<picture> <source srcset="https://github.com/ryanlintott/ILikeToMoveIt/assets/2143656/fb28d9e9-7e1c-4c05-9f00-130daf64a513" media="(prefers-color-scheme: dark)"> <img width="600" src="https://github.com/ryanlintott/ILikeToMoveIt/assets/2143656/e7df51f5-f74a-4d3e-ad03-a13b77c305a9"> </picture> [![Swift Compatibility](https://img.shields.io/endpoint?url=https%3A%2F%2Fswiftpackageindex.com%2Fapi%2Fpackages%2Fryanlintott%2FILikeToMoveIt%2Fbadge%3Ftype%3Dswift-versions)](https://swiftpackageindex.com/ryanlintott/ILikeToMoveIt) [![Platform Compatibility](https://img.shields.io/endpoint?url=https%3A%2F%2Fswiftpackageindex.com%2Fapi%2Fpackages%2Fryanlintott%2FILikeToMoveIt%2Fbadge%3Ftype%3Dplatforms)](https://swiftpackageindex.com/ryanlintott/ILikeToMoveIt) ![License - MIT](https://img.shields.io/github/license/ryanlintott/ILikeToMoveIt) ![Version](https://img.shields.io/github/v/tag/ryanlintott/ILikeToMoveIt?label=version) ![GitHub last commit](https://img.shields.io/github/last-commit/ryanlintott/ILikeToMoveIt) [![Mastodon](https://img.shields.io/badge/[email protected]?style=flat)](http://mastodon.social/@ryanlintott) [![Twitter](https://img.shields.io/badge/[email protected]?style=flat)](http://twitter.com/ryanlintott) # Overview - Add [accessible move actions](#accessibilitymoveable) to any array of items in a SwiftUI List or ForEach. - Make drag-and-drop operations easier for custom types in iOS 14 and 15 using [`Providable`](#providable) - Make drag-to-create-a-new-window operations easier in iPadOS using [`UserActivityProvidable`](#useractivityprovidable) # DragAndDrop (example app) Check out the [example app](https://github.com/ryanlintott/DragAndDrop) to see how you can use this package in your iOS app. # Installation 1. In Xcode go to `File -> Add Packages` 2. Paste in the repo's url: `https://github.com/ryanlintott/ILikeToMoveIt` and select by version. # Usage Import the package using `import ILikeToMoveIt` # Platforms This package is compatible with iOS 14+ but the accessibility move feature only works for iOS 15+. # Support If you like this package, buy me a coffee to say thanks! [![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/X7X04PU6T) - - - # Details ## AccessibilityMoveable *\*iOS 15+* Two modifiers are required to enable accessible move actions. One for each item and one for the list itself. ```swift List { ForEach(items) { item in Text(item.name) .accessibilityMoveable(item) } } .accessibilityMoveableList($items, label: \.name) ``` ### `.accessibilityMoveable` Adding this modifier will add accessibility actions to move the item up, down, to the top of the list and to the bottom. If you want to customize these actions you can supply your own array. Example: If you have a short list and only want up and down. ```swift .accessibilityMoveable(item, actions: [.up, .down]) ``` Example: If you have a long list and want options to move items more than one step at a time. ```swift .accessibilityMoveable(item, actions: [.up, .down, .up(5), .down(5), .toTop, .toBottom]) ``` When the user triggers an accessibility action the following results are reported back via a UIAccessibility announcement: - "moved up", "moved down", or "not moved" - "by [number of spaces]" if moved by more than one space. - "above [item label]" if moved down and "below [item label]" if moved up. Only if a label keypath is was provided. - "At top" or "At bottom" if at the top or bottom of the list. 
### `.accessibilityMoveableList` This modifier applies the changes from the move actions to the list and adjusts the accessibility focus to ensure it stays on the correct item. You pass in a binding to the array of items and an optional label keypath. This label will be read out after moving an item to let the user know what item is directly below after moving up or directly above after moving down. ```swift .accessibilityMoveableList($items, label: \.name) ``` ### Known issues - Moving the same item again immediately after moving it may cause the accessibility focus to lag and another item will be moved instead. ## Providable This protocol allows for easier drag and drop for `Codable` objects in iOS 14 and 15 Drag and drop operations were made much easier in iOS 16 by the `Transferable` protocol. Older methods use `NSItemProvider` and were cumbersome to set up. ### How to use it Conform your object to `Providable`. Add readable and writable types, then add functions to transform your object to and from those types. ```swift extension Bird: Providable { static var writableTypes: [UTType] { [.bird] } static var readableTypes: [UTType] { [.bird, .plainText] } func data(type: UTType) async throws-> Data? { switch type { case .bird: return try JSONEncoder().encode(self) default: return nil } } init?(type: UTType, data: Data) throws { switch type { case .bird: self = try JSONDecoder().decode(Bird.self, from: data) case .plainText: let string = String(decoding: data, as: UTF8.self) self = Bird(name: string) default: return nil } } } ``` You will need to add any custom types to your project. Project > Target > Info > ExportedTypeIdentifiers ### Adding drag and drop operations Add a drag option to a view like this: ```swift .onDrag { bird.provider } ``` And a drop option like this: ```swift .onDrop(of: Bird.readableTypes) { providers, location in providers.loadItems(Bird.self) { bird, error in if let bird { birds.append(bird) } } return true } ``` And even an insert option like this: ```swift .onInsert(of: Bird.readableTypes) { index, providers in providers.loadItems(Bird.self) { bird, error in if let bird { birds.insert(bird, at: index) } } } ``` ## UserActivityProvidable Extension to the `Providable` protocol to add easy drag to new window (a feature not supported by `Transferable`) on iPadOS 16+ Add your activity type string to plist under `NSUserActivityTypes` and then add the same string to the activityType parameter on your codable type. ```swift extension Bird: UserActivityProvidable { static var activityType: String { "com.ryanlintott.draganddrop.birdDetail" } } ``` Use the `onContinueUserActivity` overload function that takes a `UserActivityProvidable` object to handle what your app does when opened via this activity. ```swift .onContinueUserActivity(Bird.self) { bird in /// Adjust state based on your object. } ``` You can also target a separate WindowGroup for your object. Make sure you still use `onContinueUserActivity` in your view to ensure the object gets loaded. ```swift WindowGroup { BirdDetailView() } .handlesExternalEvents(matching: [Bird.activityType]) ```
dermavan/callme-
https://github.com/dermavan/callme-
null
# callme-
malbertzard/inline-fold.nvim
https://github.com/malbertzard/inline-fold.nvim
A Neovim version of the VS Code inline-fold plugin
# inline-fold.nvim `inline-fold.nvim` is a Neovim plugin inspired by the vscode plugin `inline-fold`. It provides a convenient way to define patterns in files that get concealed inline. This can be used for many things for example for CSS classes in HTML files especially with TailwindCSS **Note: It doesn't work for multiline patterns.** ## Features ✨ - Toggle folding of the content - Replace the folded content with a placeholder for improved readability. - Placeholder can be defined by the user per pattern - pattern can be easily added ## Installation 💻 Use your preferred plugin manager to install `inline-fold.nvim`. ### Plug ```lua { "malbertzard/inline-fold.nvim", opts = { defaultPlaceholder = "…", queries = { -- Some examples you can use html = { { pattern = 'class="([^"]*)"', placeholder = "@" }, -- classes in html { pattern = 'href="(.-)"' }, -- hrefs in html { pattern = 'src="(.-)"' }, -- HTML img src attribute } }, } } ``` ## Usage Once installed, the plugin provides the following functionality: ### Commands - `InlineFoldToggle`: Toggles the folding of the content within CSS class attributes. ### Call `InlineFoldToggle` via `autocmd` Add the following snippet to your configurations to call `InlineFoldToggle` automatically based on filename patterns. ```lua vim.api.nvim_create_autocmd({ 'BufEnter', 'BufWinEnter' }, { pattern = { '*.html', '*.tsx' }, callback = function(_) if not require('inline-fold.module').isHidden then vim.cmd('InlineFoldToggle') end end, }) ``` ### Todo List 📝 - [ ] Check if moving to Treesitter is viable - [ ] Create some sub commands for updating and removing conceals - [ ] Record showcase ## License This project is licensed under the MIT License.
Athlon007/Leomard
https://github.com/Athlon007/Leomard
A Lemmy client for macOS
# Leomard [![](https://img.shields.io/github/downloads/Athlon007/Leomard/total)](https://github.com/Athlon007/Leomard/releases) [![](https://img.shields.io/github/v/release/Athlon007/Leomard?include_prereleases)](https://github.com/Athlon007/Leomard/releases) [![](https://img.shields.io/github/license/Athlon007/Leomard)](LICENSE.md) [![](https://img.shields.io/lemmy/leomard%40lemm.ee)](https://lemm.ee/c/leomard) <img align="right" src="Assets/Icon/mac.iconset/icon_512x512.png" alt="icon" width="128" /> Leomard is a Lemmy client for macOS, written in Swift using SwiftUI. Leomard allows you to browse Lemmy instances, view posts and comments, and create new posts and comments, read replies, and more. Visit our [community on Lemmy](https://lemm.ee/c/leomard)! I made this app, because I do not like Lemmy's web interface, and I wanted to learn Swift. Besides, I like using native apps more than web apps. This app is free and open source, is not affiliated with Lemmy or its developers, and it is developed in my free time. Please note that this app is still in early development, so bugs and issues are to be expected. ### Why the name? - Leopard, but with an M, because it's a Lemmy client (the app icon is a leopard) - Leopards are larger than lemmings - Cats are cool ## Requirements - macOS 13.0 Ventura or later ## Installation Download the .dmg file from the releases page, and drag the app to your Applications folder. Start the app, you can login by clicking "Profile" in the bottom left corner. *Note: You may be prompted to allow the app to access your Keychain. This is required to store your authorization token. It's recommended to press "Always Allow" to avoid having to enter your password every time you start the app.* *Note: macOS may warn you that the app is from an unidentified developer. This is because I don't have an Apple Developer account, and the app has not been notarized. You can still run the app by going into System Settings > Privacy & Security > Scroll to Security section > Click "Open Anyway"* ## Screenshots ![Screenshot 1](Assets/Screenshots/1.png) ![Screenshot 2](Assets/Screenshots/2.png) ![Screenshot 3](Assets/Screenshots/3.png) ![Screenshot 4](Assets/Screenshots/4.png) ![Screenshot 5](Assets/Screenshots/5.png) ## Contributing If you want to contribute to this project, you can do so by forking the repository, and then creating a pull request. You can also create an issue if you find a bug or have a feature request. Please see [CONTRIBUTING](CONTRIBUTING.md) for more information. ## Support You can support me through here: - [Ko-fi](https://ko-fi.com/athlon) - [PayPal](https://www.paypal.com/donate/?hosted_button_id=8VASR9RLLS76Y) ## License This project is licensed under GPL-3.0. See [LICENSE](LICENSE.md) for more information. ## Credits - My friend **Fluffeline** for coming up with the whole Leopard thing - App Icon by [vintprox](https://github.com/vintprox) licensed under CC-BY-SA - [swift-markdown-ui](https://github.com/gonzalezreal/swift-markdown-ui) licensed under MIT License - [Nuke](https://github.com/kean/Nuke) licensed under MIT License
aswin2108/Portfolio
https://github.com/aswin2108/Portfolio
Welcome to my online resume
## My Portfolio Welcome to my [portfolio](https://aswin2108.github.io/Portfolio/)! This project showcases my skills, projects, and contributions. It serves as an online resume and a platform to highlight my accomplishments. Do star the repo if you liked it...! ## Table of Contents - [My Portfolio](#my-portfolio) - [Table of Contents](#table-of-contents) - [Features](#features) - [Technologies Used](#technologies-used) - [Installation](#installation) - [Usage](#usage) - [Contributing](#contributing) ## Features - **Project Showcase**: View my featured projects with descriptions, screenshots, and links. - **Skills and Expertise**: Discover the technologies and skills I have experience with. - **Open Source Contributions**: Explore my contributions to open source projects and organizations. - **About Me**: Learn more about me, my background, and my interests. - **Contact Information**: Find ways to get in touch with me. ## Technologies Used - HTML - CSS - JavaScript - React.js - styled-components - ... ## Installation 1. Clone the repository: `git clone https://github.com/aswin2108/Portfolio.git` 2. Navigate to the project directory: `cd Portfolio` 3. Install the dependencies: `yarn install` ## Usage 1. Start the development server: `yarn start` 2. Open your browser and visit: `http://localhost:3000` ## Contributing Contributions are welcome! If you have any ideas, suggestions, or bug fixes, please open an issue or submit a pull request.
abacaj/code-eval
https://github.com/abacaj/code-eval
Run evaluation on LLMs using human-eval benchmark
# code-eval ## What This is a repo I use to run human-eval on code models, adjust as needed. Some scripts were adjusted from wizardcoder repo ([process_eval.py](https://github.com/nlpxucan/WizardLM/blob/main/WizardCoder/src/process_humaneval.py)). The evaluation code is duplicated in several files, mostly to handle edge cases around model tokenizing and loading (will clean it up). ## Results Table is sorted by pass@1 score. | model | size | pass@1 | pass@10 | screenshot | | ----------------------------------------------------------------------------------------------------- | ---- | ------- | ------- | ------------------------------------------------------------------------------------------------------------------ | | [sahil2801/replit-code-instruct-glaive](https://huggingface.co/sahil2801/replit-code-instruct-glaive) | 3B | 63.5% | 67% | ![instruct-glaive](https://github.com/abacaj/code-eval/assets/7272343/6fd7527d-0dc4-4b48-8a57-ad0373074bc5) | | [WizardCoder-15B-V1.0](https://huggingface.co/WizardLM/WizardCoder-15B-V1.0) | 15B | 57% | 68.9% | ![wizardcoder](https://github.com/abacaj/code-eval/assets/7272343/0b941ff8-b474-4236-bbc0-89d925bbd34e) | | [bigcode/starcoder](https://huggingface.co/bigcode/starcoder) | 15B | 34.6% | 48.7% | ![starcoder](https://github.com/abacaj/code-eval/assets/7272343/eb5df978-f56b-4557-a433-8b8fa863a059) | | [openchat/opencoderplus](https://huggingface.co/openchat/opencoderplus) | 15B | 27.3% | 43.9% | ![opencoder](https://github.com/abacaj/code-eval/assets/7272343/1fa9f5ef-941b-4ea8-981e-c3f258c03fee) | | [teknium/Replit-v1-CodeInstruct-3B](https://huggingface.co/teknium/Replit-v1-CodeInstruct-3B) | 3B | 25.8% | 42.6% | ![replit-codeinstruct-v1](https://github.com/abacaj/code-eval/assets/7272343/4fca98d8-2c22-43ce-9639-e998ecb4fedc) | | [teknium/Replit-v2-CodeInstruct-3B](https://huggingface.co/teknium/Replit-v2-CodeInstruct-3B) | 3B | 21.5% | 31% | ![replit-codeinstruct-v2](https://github.com/abacaj/code-eval/assets/7272343/655aaa1d-0715-4fcd-b9ba-a22b5fddb215) | | [replit-code-v1-3b](https://huggingface.co/replit/replit-code-v1-3b) | 3B | 17.1% | 29.8% | ![replit-code-v1](https://github.com/abacaj/code-eval/assets/7272343/6b387aa8-db60-4f04-b458-35b010b1145c) | | [mpt-7b](https://huggingface.co/mosaicml/mpt-7b) | 7B | 15.9% | 23.7% | ![mpt-7b](https://github.com/abacaj/code-eval/assets/7272343/16965905-a368-4254-aeab-5e44126eba84) | | [xgen-7b-8k-base](https://huggingface.co/Salesforce/xgen-7b-8k-base) | 7B | 14.9% | 22.5% | ![xgen-7b-8k-base](https://github.com/abacaj/code-eval/assets/7272343/995c84a9-ee69-43bf-8502-a74eba1d927a) | | [openllama-7b-v2](https://huggingface.co/openlm-research/open_llama_7b) | 7B | 14% | 23.1% | ![openllama-7b-v2](https://github.com/abacaj/code-eval/assets/7272343/e38f08a0-ae74-4c51-b3a7-638781477e1b) | | [llama-2-7b](https://huggingface.co/meta-llama/Llama-2-7b-hf) | 7B | 13.1% | 21.9% | ![llama-2-7b](https://github.com/abacaj/code-eval/assets/7272343/cc86cc7c-beac-4993-9ca3-d91a48a790e4) | | [llama-7b](https://huggingface.co/huggyllama/llama-7b) | 7B | 12.1% | 18.9% | ![llama-7b](https://github.com/abacaj/code-eval/assets/7272343/605a3c4e-0b2b-4c10-a185-f2a4d34ec10d) | | [mpt-30b](https://huggingface.co/mosaicml/mpt-30b) | 30B | pending | pending | pending | ## FAQ > Why is there a discrepancy on some of the scores between official numbers? Because it is not obvious or published what prompt or processing the official models used to conduct their evaluation on this benchmark. 
The goal here is to try and best reproduce those numbers, in many cases it is possible to get very close to the published numbers. All of the scores here were run independently of any published numbers and are reproducible by cloning the repo and following the setup. > Why do some models have a filter_code post generation step? Base models can in many cases repeat outputs, breaking the benchmark scores. Instruct models don't have this problem and so you won't see this step, they tend to output a end of sequence token. ## Setup Create python environment ```sh python -m venv env && source env/bin/activate ``` Install dependencies ```sh pip install -r requirements.txt ``` Run the eval script ```sh # replace script file name for various models: # eval_wizard.py # eval_opencode.py # eval_mpt.py # eval_starcoder.py # eval_replit.py # eval_replit_glaive.py # eval_replit_instruct.py python eval_wizard.py ``` Process the jsonl file to extract code samples from model completions. **Note**: Only wizard & opencoder require this, they return markdown output with code. ```sh # replace args for various models: # --path results/wizard --out_path results/wizard/eval.jsonl # --path results/opencode --out_path results/opencode/eval.jsonl python process_eval.py --path results/wizard --out_path results/wizard/processed.jsonl --add_prompt ``` Then get the results ```sh # replace args for various models: # results/wizard/processed.jsonl # results/starcoder/eval.jsonl # results/mpt/eval.jsonl # results/opencode/processed.jsonl # results/replit_instruct/eval.jsonl # results/replit_glaive/eval.jsonl # results/replit/eval.jsonl evaluate_functional_correctness results/wizard/processed.jsonl ```
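For reference, the pass@1 and pass@10 numbers in the table above follow the unbiased pass@k estimator from the HumanEval paper, which is what `evaluate_functional_correctness` reports. A minimal sketch of that estimator (handy for sanity-checking scores from the jsonl results), assuming `n` generated completions per task of which `c` pass the tests:

```python
import numpy as np

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k: n = completions sampled per task, c = completions that passed, k = budget."""
    if n - c < k:
        return 1.0  # enough passing samples that every k-subset contains at least one pass
    return 1.0 - np.prod(1.0 - k / np.arange(n - c + 1, n + 1))

# Example: 20 completions for one task, 5 of them pass the unit tests
print(pass_at_k(n=20, c=5, k=1))   # 0.25
print(pass_at_k(n=20, c=5, k=10))  # ~0.98
```

The per-task values are then averaged over all 164 HumanEval problems to get the benchmark score.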
iamlooper/VIC
https://github.com/iamlooper/VIC
Your personalized conversational companion with dynamic personas, changeable conversational style and real-time information.
![VIC](https://github.com/iamlooper/VIC/raw/main/vic.png) # VIC (Versatile Intelligent Chatbot) 🧠 Your personalized conversational companion with dynamic personas, changeable conversational style and real-time information. ## Download 📲 You can download VIC from the following: - [Play Store](https://play.google.com/store/apps/details?id=com.looper.vic) - [Pling](https://www.pling.com/p/2059807) - [Buy Me a Coffee](https://buymeacoffee.com/iamlooper/posts) (Early access) ## Description 📝 Introducing VIC (Versatile Intelligent Chatbot), a personalized AI assistant designed to cater to your every need. With dynamic personas and a changeable conversation style, this companion adapts to your preferences, ensuring a truly customized experience. Stay connected to the world with real-time information at your fingertips. With internet access, our companion provides up-to-date news, weather updates, and much more, keeping you informed and empowered. Experience the convenience of having a versatile and engaging conversational partner who understands your unique requirements. Whether you're seeking a friendly chat or need assistance with tasks, our companion is always ready to lend a helping hand. Discover a new level of interaction and connectivity with our personalized conversational companion. Embrace the future of AI-powered companionship and unlock a world of possibilities. ## Screenshots 📱 [<img src="https://github.com/iamlooper/VIC/raw/main/screenshots/ss_01.png" width=140>](https://github.com/iamlooper/VIC/raw/main/screenshots/ss_01.png) [<img src="https://github.com/iamlooper/VIC/raw/main/screenshots/ss_02.png" width=140>](https://github.com/iamlooper/VIC/raw/main/screenshots/ss_02.png) [<img src="https://github.com/iamlooper/VIC/raw/main/screenshots/ss_03.png" width=140>](https://github.com/iamlooper/VIC/raw/main/screenshots/ss_03.png) [<img src="https://github.com/iamlooper/VIC/raw/main/screenshots/ss_04.png" width=140>](https://github.com/iamlooper/VIC/raw/main/screenshots/ss_04.png) ## Support Me 💙 If you liked any one of my projects then consider supporting me via following: - [Buy Me a Coffee](https://buymeacoffee.com/iamlooper/membership) - [Telegram Channel](https://loopprojects.t.me) ## Credits 👥 Due to the combined efforts and expertise of the following people, this project has achieved its success: - [Aerath Xlythe](https://t.me/KaldirimMuhendisi) - [Emulond Argent](https://t.me/Emulond) - [Chirag](https://t.me/selfmuser) - [leafinferno](https://t.me/leafinferno) - [Quick ⚡](https://t.me/username0136) - [decipher](https://t.me/decipher3114) - [NMPS](https://t.me/NAVISAVIO) - [Jis G Jacob](https://t.me/StudioKeys) - [DrDisagree](https://t.me/DrDisagree) Message me if I missed anyone. 😉 ## Licenses 📄 - [Kotlin](https://kotlinlang.org) (Apache 2.0) - [AndroidX](https://developer.android.com/jetpack/androidx) (Apache 2.0) - [Material components for Android](https://github.com/material-components/material-components-android) (Apache 2.0) - [Markwon](https://github.com/noties/Markwon) (Apache 2.0) - [Better Link Movement Method](https://github.com/saket/Better-Link-Movement-Method) (Apache 2.0) - [OkHttp](https://square.github.io/okhttp) (Apache 2.0) - [RikkaX](https://github.com/RikkaApps/RikkaX) (MIT) - [LoadingDots for Android](https://github.com/EyalBira/loading-dots) (MIT)
yamoo9/ssafy-react-2023
https://github.com/yamoo9/ssafy-react-2023
React modular special lecture - Samsung Youth Academy, July 2023
# SSAFY, React
Samsung Youth Academy, React modular special lecture (July 2023)
bschaatsbergen/dnsee
https://github.com/bschaatsbergen/dnsee
Check DNS configurations quickly
# dnsee [![Release](https://github.com/bschaatsbergen/dnsee/actions/workflows/goreleaser.yaml/badge.svg)](https://github.com/bschaatsbergen/dnsee/actions/workflows/goreleaser.yaml) ![GitHub release (latest SemVer)](https://img.shields.io/github/v/release/bschaatsbergen/dnsee) ![GitHub commits since latest release (by SemVer)](https://img.shields.io/github/commits-since/bschaatsbergen/dnsee/latest) [![Go Reference](https://pkg.go.dev/badge/github.com/bschaatsbergen/dnsee.svg)](https://pkg.go.dev/github.com/bschaatsbergen/dnsee) ![GitHub all releases](https://img.shields.io/github/downloads/bschaatsbergen/dnsee/total) Check DNS configurations quickly ## Brew To install dnsee using brew, simply do the below. ```sh brew tap bschaatsbergen/dnsee brew install dnsee ``` ## Binaries You can download the [latest binary](https://github.com/bschaatsbergen/dnsee/releases/latest) for Linux, MacOS, and Windows. ## Examples Using `dnsee` is very simple. ### Fetch all records To get all records for a domain name: ``` $ dnsee google.com A google.com. 146 142.251.39.110 AAAA google.com. 300 2a00:1450:400e:803::200e MX google.com. 193 smtp.google.com. 10 NS google.com. 103 ns1.google.com. NS google.com. 103 ns4.google.com. NS google.com. 103 ns3.google.com. NS google.com. 103 ns2.google.com. ``` ### Fetch all records for a specific type To get all records for a domain name of a specific type: ``` $ dnsee google.com -q A A google.com. 146 142.251.39.110 ``` ### Fetch all records using a different DNS server To get all records for a domain name using a different DNS server: ``` $ dnsee gooogle.com --dns-server-ip 1.1.1.1 A google.com. 146 142.251.39.110 AAAA google.com. 300 2a00:1450:400e:803::200e MX google.com. 193 smtp.google.com. 10 NS google.com. 103 ns1.google.com. NS google.com. 103 ns4.google.com. NS google.com. 103 ns3.google.com. NS google.com. 103 ns2.google.com. ``` ## Contributing Contributions are highly appreciated and always welcome. Have a look through existing [Issues](https://github.com/bschaatsbergen/dnsee/issues) and [Pull Requests](https://github.com/bschaatsbergen/dnsee/pulls) that you could help with.
roipoussiere/jaffle
https://github.com/roipoussiere/jaffle
A hybrid yaml/node editor for Tidal Cycles, based on Strudel.
# Jaffle **J**ust **a**nother **f**lat **f**ile **l**ivecoding **e**nvironment. <img style="display: block; margin: auto;" src="./website/static/images/jaffle_logo_128.png"/> Check out the [demo website](https://roipoussiere.frama.io)! ## Introduction Jaffle is an hybrid yaml/node editor for [Tidal-cycles](https://tidalcycles.org/), based on [Strudel](https://strudel.tidalcycles.org). It aims to make algorithmic music more accessible for people who are not familiar with programming languages. ![](images/editor_and_graph.png) Under the hood, Jaffle is a transpiler generating JavaScript code, which is then interpreted by Strudel. ## Features - tune visualization - easy to read yaml syntax (albeit ususally a bit more verbose than Strudel, because less compact) - use lambda functions, expressions, variable definitions, [and more](https://roipoussiere.frama.io/jaffle/syntax/) - includes all Strudel examples converted into Jaffle syntax - extensive [unit tests](tests/transpiler.test.ts) ## Roadmap - make the graph editable and allow to switch between yaml and node editor - improve node editor layout - allow to import strudel tune - create a desktop app - code completion - own documentation ## Syntax Go to the [syntax section](https://roipoussiere.frama.io/jaffle/syntax) in the Jaffle website (or alternatively read [its source](./website/content/syntax.md) if necessary). ## Technical stack Jaffle stands on the shoulders of giants: - music code interpreter: [Strudel](https://strudel.tidalcycles.org/) - graph renderer: [D3](https://d3js.org/) - text editor: [CodeMirror](https://codemirror.net/) - website generator: [Hugo](https://gohugo.io/) - unit tests framework: [Jest](https://jestjs.io) - linter: [ESLint](https://eslint.org/) ## Contributing See the [contribution guide](./CONTRIBUTING.md)! ## Authorship ### Jaffle transpiler and website - credits: Nathanaël Jourdane and contributors - license: [AGPL-3.0](./LICENSE) - source: https://framagit.org/roipoussiere/jaffle ### Strudel engine - credits: Strudel contributors - license: [AGPL-3.0](https://www.gnu.org/licenses/agpl-3.0.txt) - source: https://github.com/tidalcycles/strudel ### Pre-loaded sounds - piano: - credits: Alexander Holm - license: [CC-by](http://creativecommons.org/licenses/by/3.0) - source: https://archive.org/details/SalamanderGrandPianoV3 - VCSL: - credits: Versilian Studios LLC - license: [CC0](https://creativecommons.org/publicdomain/zero/1.0/) - source: https://github.com/sgossner/VCSL - Tidal drum machines: - source: https://github.com/ritchse/tidal-drum-machines - EmuSP12: - source: https://github.com/tidalcycles/Dirt-Samples
TechTitan0624/next-proejct-good
https://github.com/TechTitan0624/next-proejct-good
null
# An open source live screen share and webcam video recorder - [An open source live screen share and webcam video recorder](#an-open-source-live-screen-share-and-webcam-video-recorder) - [Deploy your instance](#deploy-your-instance) - [Features](#features) - [Contributions](#contributions) - [Requirements](#requirements) - [Code structure](#code-structure) - [How to run](#how-to-run) - [.env contents](#env-contents) - [How to build](#how-to-build) - [TO DO](#to-do) ## Deploy your instance 1. Deploy the backend project to some cloud (heroku etc) to consume websockets. 1. Deploy the frontend project to vercel with single click [![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/git/external?repository-url=https%3A%2F%2Fgithub.com%2Ftechnikhil314%2Fnext-webrtc) 1. Add required env variables - NEXT_PUBLIC_WEBSOCKET_URL - URL to your websocket server. The backend deployment. If you want to use meeting feature - NEXT_PUBLIC_URL - URL of your site - NODE_ENV - `development` or `production` ## Features - Background removal - Virtual backgrounds - Background blur ## Contributions Welcome :) ### Requirements 1. Node >= v15.x 2. npm >= v7 3. yarn 1.x ### Code structure 1. It is a monorepo managed by yarn 2. There are two packages backend and frontend 3. Backend is just a small websocket server used for signaling 4. Frontend is actual UI built with nextjs 5. It uses google stun server for populating ice candidates ### How to run 1. Install all dependencies using `yarn --frozen-lockfile` 2. add `.env` see the contents [below](#env-contents) 3. run backend using `yarn workspaces @openrtc/backend start` 4. run frontend using `yarn workspaces @openrtc/frontend start` ### .env contents ```bash NEXT_PUBLIC_WEBSOCKET_URL=wss://localhost:4000/ NODE_ENV=development NEXT_PUBLIC_URL=http://localhost:3000 ``` ### How to build 1. You dont need to build backend 1. You can build frontend package with `yarn workspaces @openrtc/frontend build` command ## TO DO 1. Handle if someone stops screen share in recording vlog - This currently stops recording - Ideally it should keep on recording but change video from screen to user 1. First time visitor faces lag in audio recording 1. First time visitor can not see small PiP video 1. Microphone volume control 1. Ability to name the video file - Currently it uses ISO date time string
tetrzim/diffusion-human-feedback
https://github.com/tetrzim/diffusion-human-feedback
Censored Sampling of Diffusion Models Using 3 Minutes of Human Feedback
# Censored Sampling of Diffusion Models Using 3 Minutes of Human Feedback This is the codebase for [**Censored Sampling of Diffusion Models Using 3 Minutes of Human Feedback**](https://arxiv.org/abs/2307.02770). This repository is based on [openai/guided-diffusion](https://github.com/openai/guided-diffusion) and includes the following features: - Implementation of the human feedback framework - Integration of the `latent_guided_diffusion` module, which is derived from [CompVis/latent-diffusion](https://github.com/CompVis/latent-diffusion) with modification to perform guided sampling. - Most components used in our experiments related to guidance is based on [arpitbansal297/Universal-Guided-Diffusion](https://github.com/arpitbansal297/Universal-Guided-Diffusion) with some necessary modifications. ## Contents - [Installation](#installation) - [Experiment 1. MNIST 7 (censoring crossed 7s)](#experiment-1-mnist-7) - [Experiment 2. LSUN Church (censoring stock watermarks)](#experiment-2-lsun-church) - [Experiment 3. ImageNet Tench (censoring human faces)](#experiment-3-imagenet-tench) - [Experiment 4. LSUN Bedroom (censoring broken images)](#experiment-4-lsun-bedroom) # Installation Make sure you have Python version **3.9** installed. To install the required dependencies, run the following command: ``` pip install -e . ``` This will install the `guided_diffusion` python package that the scripts depend on. You also need to separately install some packages. Run the following command to install them. ``` pip install torchvision omegaconf einops ``` # Experiment 1. MNIST 7 ## 1.1 Prepare training data Run the following command: ```sh python datasets/mnist_7.py ``` The script will download the MNIST train dataset, select only the images of the digit 7, resize the images to 32x32, and save them into the `mnist_7` directory. <!-- If you want to use your own datset, you can place your images in a directory with the extensions ".jpg", ".jpeg", or ".png". The training code will be able to read them if you pass the `--data_dir` argument pointing to the directory when running the training script. --> ## 1.2 Train diffusion model Run the following shell script to train the DDPM model on MNIST 7s: ```sh MODEL_FLAGS="--image_size 32 --image_channels 1 --num_channels 128 --num_res_blocks 3" DIFFUSION_FLAGS="--diffusion_steps 1000 --noise_schedule linear" TRAIN_FLAGS="--lr 1e-4 --batch_size 256 --save_interval 100000" LOG_DIR="path/to/log" # The diffusion model will be saved in .pt format within the directory specified by this path. NUM_GPUS="1" # The number of GPUs used in parallel computing. If larger than 1, adjust the batch_size argument accordingly. echo $(mpiexec -n $NUM_GPUS python scripts/image_train.py --log_dir=$LOG_DIR --data_dir=mnist_7 --rgb=False --random_flip=False $MODEL_FLAGS $DIFFUSION_FLAGS $TRAIN_FLAGS) ``` ## 1.3 Prepare human feedback data for reward model training ### 1.3.1 Generate and save baseline samples Run the following shell script to generate baseline samples: ```sh MODEL_FLAGS="--image_size 32 --image_channels 1 --num_channels 128 --num_res_blocks 3" DIFFUSION_FLAGS="--diffusion_steps 1000 --noise_schedule linear" SAMPLE_FLAGS="--batch_size 250 --num_samples 2000" MODEL_PATH="path/to/diffusion/model.pt" LOG_DIR="path/to/log" # The NPZ file containing all sample data and individual sample images (in PNG format) will be saved within this directory. 
NUM_GPUS="8" echo $(mpiexec -n $NUM_GPUS python scripts/image_sample.py --log_dir $LOG_DIR --model_path $MODEL_PATH $MODEL_FLAGS $DIFFUSION_FLAGS $SAMPLE_FLAGS) ``` One may also use the following command to manually convert the NPZ sample file into PNG images and save them into a designated path. ```sh python scripts/save_samples_as_files.py \ --sample_path path/to/sample.npz \ --save_dir path/to/baseline/sample/dir # Each sample image will be saved in PNG format within this directory. ``` ### 1.3.2 Provide human feedback on baseline samples using GUI Run the following command to run our GUI-based human feedback collector, until you find desired number (10, to reproduce our experiments) of malign images. of The provided labels will comprise the train data for the **reward model**: ```sh python scripts/hf/collect_feedback.py \ --data_dir path/to/baseline/sample/dir \ --feedback_path path/to/total_feedback.pkl \ --censoring_feature strike-through-cross \ --resolution some-integer-value # Resolution in which each image will be displayed (default 150) --grid_row some-integer-value # Number of rows in the image grid to be displayed --grid_col some-integer-value # Number of columns in the image grid ``` The provided human labels will be saved into the file `total_feedback.pkl` within the specified directory. The PKL file stores a dictionary, whose keys are paths to generated image files from baseline sampling and values are binary labels 0 or 1, where 0 indicates malign and 1 indicates benign (when the user is not sure, `None` label can be provided). ### 1.3.3 Create partial data for ensemble training Run the following command to create the partial PKL files for training reward ensemble: ```sh python scripts/hf/select_partial_feedback.py \ --all_feedback_path path/to/total_feedback.pkl --out_feedback_path path/to/partial_feedback.pkl \ --num_malign_samles 10 \ --num_benign_samles 10 ``` **Note**: If `partial_feedback.pkl` already exists at the `out_feedback_path`, the new feedback information will be appended to it instead of overwriting the existing data. <!-- To create 5 datasets for ensemble, you can start by selecting 10 malign samples and creating 5 copies of each. Then, select an additional 10 benign samples for each subset. 
--> To reproduce the ablation study (the **Union** model case), run the following shell script to merge (union) multiple feedback files: ```sh FEEDBACK_PATH_1="path/to/partial/feedback/for/training_1.pkl" FEEDBACK_PATH_2="path/to/partial/feedback/for/training_2.pkl" FEEDBACK_PATH_3="path/to/partial/feedback/for/training_3.pkl" FEEDBACK_PATH_4="path/to/partial/feedback/for/training_4.pkl" FEEDBACK_PATH_5="path/to/partial/feedback/for/training_5.pkl" OUT_DIR="path/to/save/union_feedback.pkl" echo $(python scripts/hf/union_feedback.py --feedback_paths $FEEDBACK_PATH_1 $FEEDBACK_PATH_2 $FEEDBACK_PATH_3 $FEEDBACK_PATH_4 $FEEDBACK_PATH_5 --out_union_feedback_dir $OUT_DIR) ``` ## 1.4 Train reward models Run the following shell script to train reward models: ```sh REWARD_FLAGS="--image_size 32 --image_channels 1 --classifier_attention_resolutions 16,8,4 --classifier_depth 2 --classifier_width 128 --classifier_pool attention --classifier_resblock_updown True --classifier_use_scale_shift_norm True --output_dim 1" DIFFUSION_FLAGS="--diffusion_steps 1000 --noise_schedule linear" TRAIN_FLAGS="--augment_mnist True --num_augment 10 --iterations 1001 --anneal_lr True --lr 3e-4 --batch_size 128 --save_interval 1000 --weight_decay 0.05" # Change the 'iterations' flag to 3001 for the 'Union' case POS_WEIGHT="0.02" # Change this to 0.005 for the 'Union' case FEEDBACK_PATH="path/to/partial_feedback.pkl" AUGMENT_DATA_DIR="path/to/save/temporary/augmented/images" LOG_DIR="path/to/log" # The reward model will be saved in .pt format within this directory. NUM_GPUS="1" # Change this freely. echo $(mpiexec -n $NUM_GPUS python scripts/reward/reward_train.py --log_dir=$LOG_DIR --pos_weight=$POS_WEIGHT --augment_data_dir=$AUGMENT_DATA_DIR --feedback_path=$FEEDBACK_PATH $REWARD_FLAGS $TRAIN_FLAGS $DIFFUSION_FLAGS) ``` The `POS_WEIGHT` parameter corresponds to $\alpha$ within the weighted BCE loss $BCE_{\alpha}$. To train the **Union** model, change `POS_WEIGHT` argument to 0.005 and `FEEDBACK_PATH` to `path/to/union_feedback.pkl`. ## 1.5 Perform censored sampling Appropriately modify the following template script to perform censored sampling. ```sh MODEL_FLAGS="--image_size 32 --image_channels 1 --num_channels 128 --num_res_blocks 3" DIFFUSION_FLAGS="--diffusion_steps 1000 --noise_schedule linear" REWARD_FLAGS="--classifier_attention_resolutions 16,8,4 --classifier_depth 2 --classifier_width 128 --classifier_pool attention --classifier_resblock_updown True --classifier_use_scale_shift_norm True" SAMPLE_FLAGS="--num_recurrences 1 --backward_steps 0 --optim_lr 0.0002 --use_forward False --original_guidance True --original_guidance_wt 1.0 --batch_size 200 --num_samples 1000" MODEL_PATH="path/to/diffusion/model.pt" REWARD_PATH_1="path/to/reward/model_1.pt" REWARD_PATH_2="path/to/reward/model_2.pt" REWARD_PATH_3="path/to/reward/model_3.pt" REWARD_PATH_4="path/to/reward/model_4.pt" REWARD_PATH_5="path/to/reward/model_5.pt" LOG_DIR="path/to/log" # The NPZ file containing all sample data and individual sample images (in PNG format) will be saved within this directory. NUM_GPUS="5" # When backward/recurrence is used, set this to 1. echo $(mpiexec -n $NUM_GPUS python scripts/censored_sample.py --log_dir $LOG_DIR --model_path $MODEL_PATH --reward_paths $REWARD_PATH_1 $REWARD_PATH_2 $REWARD_PATH_3 $REWARD_PATH_4 $REWARD_PATH_5 $MODEL_FLAGS $REWARD_FLAGS $DIFFUSION_FLAGS $SAMPLE_FLAGS) ``` The above example ensembles 5 reward models. 
To reproduce our ablation study using the **Single** and **Union** models, simply remove REWARD_PATH_2 through REWARD_PATH_5 and change the value of `original_guidance_wt` to `5.0`. Change `backward_steps` ($B$ in the paper), `optim_lr` (backward guidance learning rate), and `num_recurrences` ($R$ in the paper) as desired. Note that when `backward_steps` takes a positive value, `NUM_GPUS` should be set to `1`. <!-- To enable the recurrence and backward options, set the `--num_recurrences` to 4, the `--backward_steps` to 5, and the `--original_guidance_wt` to 1. --> # Experiment 2. LSUN Church ## 2.1 Environment Setup A seperate environment should be set up for experiments in this section, due to the use of components pertaining to [`latent-diffusion`](https://github.com/CompVis/latent-diffusion) repository. Setup a conda environment `ldm` using a provided YAML file within the `latent_guided_diffusion` directory: ``` cd latent_guided_diffusion conda env create -f enviroment.yaml conda activate ldm ``` ## 2.2 Download pretrained latent diffusion model Download the pretrained LDM components from [CompVis/latent-diffusion](https://github.com/CompVis/latent-diffusion). ### Pretrained Autoencoding Models ```sh cd latent_guided_diffusion sh scripts/download_first_stages_kl-f8.sh ``` The first stage model will be downloaded within `latent_guided_diffusion/models/first_stage_models/kl-f8`. ### Pretrained LDM ```sh sh scripts/download_church_model.sh ``` The latent diffusion model will be downloaded within `latent_guided_diffusion/models/ldm/lsun_churches256`. ## 2.3 Prepare human feedback data for reward model training ### 2.3.1 Generate and save baseline samples Run the following shell script to generate baseline samples: ```sh MODEL_FLAGS="--attention_resolutions 32,16,8 --class_cond True --image_size 256 --image_channels 3 --num_channels 256 --learn_sigma True --num_heads 4 --num_res_blocks 2 --resblock_updown True --use_fp16 True --use_scale_shift_norm True" DIFFUSION_FLAGS="--diffusion_steps 1000 --noise_schedule linear" SAMPLE_FLAGS="--use_ldm True --timestep_respacing 400 --num_recurrences 1 --backward_steps 0 --use_forward False --batch_size 8 --num_samples 1000" MODEL_PATH="path/to/ldm/church/model/ckpt" # If one faithfully follows the above guidelines, then the default path will be "latent_guided_diffusion/models/ldm/lsun_churches256/model.ckpt" LDM_CONFIG_PATH="latent_guided_diffusion/configs/latent-diffusion/lsun_churches-ldm-kl-8.yaml" # Do not change this unless you know what you're doing! LOG_DIR="path/to/log" NUM_GPUS="8" # Although we run censored_sample.py, we are not censoring anything here. echo $(mpiexec -n $NUM_GPUS python scripts/censored_sample.py --log_dir $LOG_DIR --model_path $MODEL_PATH --ldm_config_path $LDM_CONFIG_PATH $MODEL_FLAGS $DIFFUSION_FLAGS $SAMPLE_FLAGS) ``` As in [Section 1.3.1](#131-generate-and-save-baseline-samples), the sampling code automatically converts the NPZ sample file into PNG images and saves them. ### 2.3.2 Labeling with GUI Same as in [Section 1.3.2](#132-provide-human-feedback-on-baseline-samples-using-gui). Just change the value input to the `censoring_feature` argument to `shutterstock-watermark` (instead of `strike-through-cross` used in that section) for clarity. ### 2.3.3 Create partial data for ensemble training Same as in [Section 1.3.3](#133-create-partial-data-for-ensemble-training) except that `num_malign_samples` and `num_benign_samples` should be set to 30, instead of 10. 
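Before moving on to reward training, it can be worth sanity-checking how many malign/benign labels a collected feedback file actually contains. Sections 1.3.2 and 2.3.2 describe the PKL as a dictionary mapping generated image paths to 0 (malign), 1 (benign), or `None` (unsure). A small inspection sketch based on that description (the path below is a placeholder, not a file shipped with the repo):

```python
import pickle
from collections import Counter

feedback_path = "path/to/partial_feedback.pkl"  # placeholder; use your own --out_feedback_path

with open(feedback_path, "rb") as f:
    feedback = pickle.load(f)  # dict: image file path -> 0 (malign) / 1 (benign) / None (unsure)

counts = Counter(feedback.values())
print(f"labeled images : {len(feedback)}")
print(f"malign (0)     : {counts[0]}")
print(f"benign (1)     : {counts[1]}")
print(f"unsure (None)  : {counts[None]}")
```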
## 2.4 Train Reward Models Run the following shell script to train reward models: ```sh TRAIN_FLAGS="--augment_lsun False --image_size 256 --rgb True --iterations 601 --anneal_lr True --lr 3e-4 --batch_size 128 --save_interval 200 --weight_decay 0.05" # Change the 'iterations' flag to 1801 for the 'Union' case POS_WEIGHT="0.1" # Change this to 0.01 for the 'Union' case FEEDBACK_PATH="path/to/partial_feedback.pkl" NUM_GPUS="1" LOG_DIR="path/to/log" echo $(mpiexec -n $NUM_GPUS python scripts/reward/reward_train_transfer.py --log_dir=$LOG_DIR --pos_weight=$POS_WEIGHT --feedback_path=$FEEDBACK_PATH $TRAIN_FLAGS) ``` <!-- You can adjust the `--isaugment` flag to enable or disable data augmentation during training. If you set `--isaugment True`, make sure to adjst the `--p_malgin_transfrom` and `--p_benign_transfrom` values based on the type of data augmentation you want to apply, such as MNIST and Tench. --> <!-- You can adjust the `pos_weight` parameter to modify the $BCE_{\alpha}$ value and control the balance between positive and negative samples during training. To obtain the union reward model, you can change the `feedback_path` to the `union_feedback` pickle file. --> ## 2.5 Perform censored sampling Appropriately modify the following template script to perform censored sampling. ```sh MODEL_FLAGS="--attention_resolutions 32,16,8 --class_cond True --image_size 256 --image_channels 3 --num_channels 256 --learn_sigma True --num_heads 4 --num_res_blocks 2 --resblock_updown True --use_fp16 True --use_scale_shift_norm True" DIFFUSION_FLAGS="--diffusion_steps 1000 --noise_schedule linear" SAMPLE_FLAGS="--time_dependent_reward False --use_ldm True --timestep_respacing 400 --num_recurrences 4 --backward_steps 0 --use_forward True --forward_guidance_wt 2.0 --batch_size 8 --num_samples 1000" MODEL_PATH="path/to/ldm/church/model/ckpt" REWARD_PATH_1="path/to/reward/model_1.pt" REWARD_PATH_2="path/to/reward/model_2.pt" REWARD_PATH_3="path/to/reward/model_3.pt" REWARD_PATH_4="path/to/reward/model_4.pt" REWARD_PATH_5="path/to/reward/model_5.pt" LDM_CONFIG_PATH="latent_guided_diffusion/configs/latent-diffusion/lsun_churches-ldm-kl-8.yaml" LOG_DIR="path/to/log" NUM_GPUS="4" echo $(mpiexec -n $NUM_GPUS python scripts/censored_sample.py --log_dir $LOG_DIR --model_path $MODEL_PATH --reward_paths $REWARD_PATH_1 $REWARD_PATH_2 $REWARD_PATH_3 $REWARD_PATH_4 $REWARD_PATH_5 --ldm_config_path $LDM_CONFIG_PATH $MODEL_FLAGS $DIFFUSION_FLAGS $SAMPLE_FLAGS) ``` The above example ensembles 5 reward models. To reproduce our ablation study using the **Single** and **Union** models, simply remove REWARD_PATH_2 through REWARD_PATH_5 and change the value of `original_guidance_wt` to `10.0`. For this setup, we do not use backward guidance so we set $B=0$. One may change `num_recurrences` ($R$ in the paper) as desired. # Experiment 3. 
ImageNet Tench ## 3.1 Download pretrainined diffusion model and classifier Download the following checkpoints provided by the OpenAI's Guided Diffusion repo: * 128x128 classifier: [128x128_classifier.pt](https://openaipublic.blob.core.windows.net/diffusion/jul-2021/128x128_classifier.pt) * 128x128 diffusion: [128x128_diffusion.pt](https://openaipublic.blob.core.windows.net/diffusion/jul-2021/128x128_diffusion.pt) ## 3.2 Prepare human feedback data for reward model training ### 3.2.1 Generate and save baseline samples Sample generation: ```sh MODEL_FLAGS="--attention_resolutions 32,16,8 --class_cond True --image_size 128 --image_channels 3 --num_channels 256 --learn_sigma True --num_heads 4 --num_res_blocks 2 --resblock_updown True --use_fp16 True --use_scale_shift_norm True" DIFFUSION_FLAGS="--diffusion_steps 1000 --noise_schedule linear" CLASSIFIER_FLAGS="--classifier_scale 0.5 --output_dim 1000 --classifier_attention_resolutions 32,16,8 --classifier_depth 2 --classifier_width 128 --classifier_pool attention --classifier_resblock_updown True --classifier_use_scale_shift_norm True" SAMPLE_FLAGS="--batch_size 128 --num_samples 1000 --target_class 0" MODEL_PATH="path/to/diffusion.pt" CLASSIFIER_PATH="path/to/classifier.pt" LOG_DIR="path/to/log" NUM_GPUS="4" echo $(mpiexec -n $NUM_GPUS python scripts/classifier_sample.py --log_dir $LOG_DIR --model_path $MODEL_PATH --classifier_path $CLASSIFIER_PATH $MODEL_FLAGS $CLASSIFIER_FLAGS $DIFFUSION_FLAGS $SAMPLE_FLAGS) ``` As in Section 1.3.1, the sampling code automatically converts the NPZ sample file into PNG images and saves them. ### 3.2.2 Provide human feedback on baseline samples using GUI Same as in [Section 1.3.2](#132-provide-human-feedback-on-baseline-samples-using-gui). Just change the value input to the `censoring_feature` argument to `human-face` (instead of `strike-through-cross` used in that section) for clarity. ### 3.2.3 Create partial data for round 1 of imitation learning Run: ```sh python scripts/hf/select_partial_feedback.py \ --all_feedback_path path/to/total_feedback.pkl --out_feedback_path path/to/partial_feedback.pkl \ --num_malign_samles 10 \ --num_benign_samles 10 ``` By adjusting `num_malign_samples` and `num_benign_samples` to 20 or 30, one can also prepare data for training non-imitation learning models used in the ablation study. ## 3.3 Train Reward Models ### 3.3.1 Reward training script Run the following shell script to train reward models: ```sh REWARD_FLAGS="--image_size 128 --output_dim 1 --classifier_attention_resolutions 32,16,8 --classifier_depth 2 --classifier_width 128 --classifier_pool attention --classifier_resblock_updown True --classifier_use_scale_shift_norm True" DIFFUSION_FLAGS="--diffusion_steps 1000 --noise_schedule linear" TRAIN_FLAGS="--augment_imgnet True --num_augment 20 --p_benign_transform 1 1 1 --p_malign_transform 1 1 1 --rgb True --iterations 501 --save_interval 500 --anneal_lr True --lr 3e-4 --batch_size 32 --weight_decay 0.05" # Change the 'iterations' flag to 1501 and 3001 (respectively) for rounds 2 and 3 of imitation learning, and the corresponding non-imitation ablation. POS_WEIGHT="0.1" FEEDBACK_PATH="path/to/partial_feedback.pkl" AUGMENT_DATA_DIR="path/to/save/temporary/augmented/images" LOG_DIR="path/to/log" NUM_GPUS="4" # This is a recommended setup. One may reduce this if one has sufficient GPU memory, but in that case increase the 'batch_size' accordingly. RESUME_CHECKPOINT="" # This should be used for imitation learning. 
echo $(mpiexec -n $NUM_GPUS python scripts/reward/reward_train.py --resume_checkpoint=$RESUME_CHECKPOINT --log_dir=$LOG_DIR --pos_weight=$POS_WEIGHT --augment_data_dir=$AUGMENT_DATA_DIR --feedback_path=$FEEDBACK_PATH $REWARD_FLAGS $TRAIN_FLAGS $DIFFUSION_FLAGS) ``` ### 3.3.2 Imitation learning To perform imitation learning, first follow the directions of [Section 3.4](#34-perform-censored-sampling) to collect censored samples using the previous round's reward model. Then repeat the procedure of [Section 3.2.2](#322-provide-human-feedback-on-baseline-samples-using-gui) to create a feedback file (named, say, `new_fb.pkl`) containing at least 10 malign/benign samples each. Next, create a copy of the PKL file used for the last round's reward training (named, say, `copy.pkl`). Then run: ```sh python scripts/hf/select_partial_feedback.py --all_feedback_path path/to/new_fb.pkl --out_feedback_path path/to/copy.pkl --num_malign_samples 10 --num_benign_samples 10 ``` After this, `copy.pkl` will contain all feedback information relevant to the new round's imitation learning. Now we re-run the script of [Section 3.3.1](#331-reward-training-script), with `FEEDBACK_PATH` being `path/to/copy.pkl` and `RESUME_CHECKPOINT` being `path/to/reward_model_from_last_round.pt`. For round 2, set `iterations` to `1501`. For round 3, set `iterations` to `3001`. ## 3.4 Perform censored sampling Run: ```sh MODEL_FLAGS="--attention_resolutions 32,16,8 --class_cond True --image_size 128 --image_channels 3 --num_channels 256 --learn_sigma True --num_heads 4 --num_res_blocks 2 --resblock_updown True --use_fp16 True --use_scale_shift_norm True" DIFFUSION_FLAGS="--diffusion_steps 1000 --noise_schedule linear" REWARD_FLAGS="--classifier_attention_resolutions 32,16,8 --classifier_depth 2 --classifier_width 128 --classifier_pool attention --classifier_resblock_updown True --classifier_use_scale_shift_norm True" SAMPLE_FLAGS="--num_recurrences 1 --classifier_scale 0.5 --backward_steps 0 --optim_lr 0.01 --use_forward False --original_guidance True --original_guidance_wt 5.0 --batch_size 50 --num_samples 200 --target_class 0" MODEL_PATH="path/to/diffusion/model.pt" CLASSIFIER_PATH="path/to/classifier/model.pt" REWARD_PATHS="path/to/reward/model.pt" LOG_DIR="path/to/log" NUM_GPUS="4" echo $(mpiexec -n $NUM_GPUS python scripts/censored_sample.py --log_dir $LOG_DIR --model_path $MODEL_PATH --classifier_path $CLASSIFIER_PATH --reward_paths $REWARD_PATH $MODEL_FLAGS $REWARD_FLAGS $DIFFUSION_FLAGS $SAMPLE_FLAGS) ``` For backward guidance and recurrence (to be combined with round 3), adjust the `backward_steps` ($B$ in the paper), `optim_lr` (backward guidance learning rate) and `num_recurrences` ($R$ in the paper). # Experiment 4. 
LSUN bedroom ## 4.1 Download pretrained model Download the following checkpoint provided by the OpenAI's Guided Diffusion repo: * LSUN bedroom: [lsun_bedroom.pt](https://openaipublic.blob.core.windows.net/diffusion/jul-2021/lsun_bedroom.pt) ## 4.2 Prepare human feedback data for reward model training ### 4.2.1 Generate and save baseline samples Sample generation: ```sh MODEL_FLAGS="--attention_resolutions 32,16,8 --class_cond False --diffusion_steps 1000 --dropout 0.1 --image_size 256 --learn_sigma True --noise_schedule linear --num_channels 256 --num_head_channels 64 --num_res_blocks 2 --resblock_updown True --use_fp16 True --use_scale_shift_norm True" SAMPLE_FLAGS="--batch_size 16 --num_samples 5000" MODEL_PATH="path/to/diffusion/model.pt" LOG_DIR="path/to/log" NUM_GPUS="4" echo $(mpiexec -n $NUM_GPUS python scripts/image_sample.py --log_dir $LOG_DIR $MODEL_FLAGS --model_path $MODEL_PATH $SAMPLE_FLAGS) ``` As in [Section 1.3.1](#131-generate-and-save-baseline-samples), the sampling code automatically converts the NPZ sample file into PNG images and saves them. ### 4.2.2 Provide human feedback on baseline samples using GUI Same as in [Section 1.3.2](#132-provide-human-feedback-on-baseline-samples-using-gui). Just change the value input to the `censoring_feature` argument to `broken-artifacts` (instead of `strike-through-cross` used in that section) for clarity. ### 4.2.3 Create partial data for ensemble training Same as in [Section 1.3.3](#133-create-partial-data-for-ensemble-training) except that `num_malign_samples` and `num_benign_samples` should be set to 100, instead of 10. ## 4.3 Train Reward Models Run: ```sh TRAIN_FLAGS="--augment_lsun False --image_size 256 --rgb True --iterations 5001 --anneal_lr True --lr 3e-4 --batch_size 128 --save_interval 1000 --weight_decay 0.05" # Change the 'iterations' flag to 15001 for the 'Union' case POS_WEIGHT="0.1" # Change this to 0.02 for the 'Union' case FEEDBACK_PATH="path/to/feedback.pkl" LOG_DIR="path/to/log" NUM_GPUS="1" # Change this freely. echo $(mpiexec -n $NUM_GPUS python scripts/reward/reward_train_transfer.py --log_dir=$LOG_DIR --pos_weight=$POS_WEIGHT --feedback_path=$FEEDBACK_PATH $TRAIN_FLAGS) ``` <!-- You can adjust the `--isaugment` flag to enable or disable data augmentation during training. If you set `--isaugment True`, make sure to adjst the `--p_malgin_transfrom` and `--p_benign_transfrom` values based on the type of data augmentation you want to apply, such as MNIST and Tench. You can adjust the `pos_weight` parameter to modify the $BCE_{\alpha}$ value and control the balance between positive and negative samples during training. To obtain the union reward model, you can change the `feedback_path` to the `union_feedback` pickle file. --> ## 4.4. Perform censored sampling Appropriately modify the following template script to perform censored sampling. 
```sh MODEL_FLAGS="--attention_resolutions 32,16,8 --class_cond False --dropout 0.1 --image_size 256 --learn_sigma True --num_channels 256 --num_head_channels 64 --num_res_blocks 2 --resblock_updown True --use_fp16 True --use_scale_shift_norm True" DIFFUSION_FLAGS="--diffusion_steps 1000 --noise_schedule linear" SAMPLE_FLAGS="--time_dependent_reward False --num_recurrences 1 --backward_steps 0 --optim_lr 0.002 --use_forward True --forward_guidance_wt 2 --batch_size 10 --num_samples 100" MODEL_PATH="path/to/diffusion/model.pt" REWARD_PATH_1="path/to/reward/model_1.pt" REWARD_PATH_2="path/to/reward/model_2.pt" REWARD_PATH_3="path/to/reward/model_3.pt" REWARD_PATH_4="path/to/reward/model_4.pt" REWARD_PATH_5="path/to/reward/model_5.pt" LOG_DIR="path/to/log" NUM_GPUS="1" # When backward/recurrence is used, this must be 1. echo $(mpiexec -n $NUM_GPUS python scripts/censored_sample.py --log_dir $LOG_DIR --model_path $MODEL_PATH --reward_paths $REWARD_PATH_1 $REWARD_PATH_2 $REWARD_PATH_3 $REWARD_PATH_4 $REWARD_PATH_5 $MODEL_FLAGS $REWARD_FLAGS $DIFFUSION_FLAGS $SAMPLE_FLAGS) ``` The above example ensembles 5 reward models. To reproduce our ablation study using the **Single** and **Union** models, simply remove REWARD_PATH_2 through REWARD_PATH_5 and change the value of `original_guidance_wt` to `10.0`. Change `backward_steps` ($B$ in the paper), `optim_lr` (backward guidance learning rate), and `num_recurrences` ($R$ in the paper) as desired. Note that when `backward_steps` takes a positive value, `NUM_GPUS` should be set to `1`. # 5. Appendix ## Errors during environment setup regarding `mpi4py` 1. If you encounter any missing package errors, use `pip` to install those packages until you no longer receive the errors related to the `mpi4py` package. If installing `mpi4py` using `pip` doesn't work, try: ```sh conda install -c conda-forge mpi4py mpich ``` 2. Once the necessary packages are installed, your environment should be ready to use. Avoid loading the `cuda/1X.X` module when running this repository. If the `module list` command displays `cuda/1X.X`, try unloading it using: ```sh module unload cuda/1X.X ```
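Since every training and sampling command above goes through `mpiexec`, a quick way to confirm that `mpi4py` is installed correctly (independent of this repo) is a tiny rank-check script:

```python
# check_mpi.py -- run with: mpiexec -n 2 python check_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
print(f"rank {comm.Get_rank()} of {comm.Get_size()} is up")
```

If both ranks print, the MPI stack that the training and sampling scripts rely on is working.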
X-D-Lab/MindChat
https://github.com/X-D-Lab/MindChat
🐋MindChat (漫谈): Psychological Large Language Model
<h1 align="center">🐋MindChat(漫谈): 心理大模型</h1> <p align="center"> <a href="https://github.com/X-D-Lab/MindChat"><img src="https://img.shields.io/badge/GitHub-24292e" alt="github"></a> <a href="https://huggingface.co/X-D-Lab"><img src="https://img.shields.io/badge/-HuggingFace-yellow" alt="HuggingFace"></a> <a href="https://modelscope.cn/organization/X-D-Lab"><img src="https://img.shields.io/badge/ModelScope-blueviolet" alt="modelscope"></a> <a href="https://openi.pcl.ac.cn/XD-LAB/MindChat"><img src="https://img.shields.io/badge/-OpenI-337AFF" alt="OpenI"></a> <a href="https://www.bilibili.com/video/BV1tz4y1x79N/?share_source=copy_web&vd_source=8162f92b2a1a94035ca9e4e0f6e1860a"><img src="https://img.shields.io/badge/-bilibili-ff69b4" alt="bilibili"></a> <a href="https://openxlab.org.cn/apps/detail/thomas-yanxin/MindChat"><img src="https://img.shields.io/badge/-OpenXLab-1074B0" alt="OpenXLab"></a> </p> <div align="center"> [![GitHub license](https://img.shields.io/github/license/X-D-Lab/MindChat )](https://github.com/X-D-Lab/MindChat/blob/main/LICENSE) [![GitHub Stars](https://img.shields.io/github/stars/X-D-Lab/MindChat)](https://github.com/X-D-Lab/MindChat/stargazers) [![GitHub Forks](https://img.shields.io/github/forks/X-D-Lab/MindChat)](https://github.com/X-D-Lab/MindChat/fork) [![GitHub Contributors](https://img.shields.io/github/contributors/X-D-Lab/MindChat)](https://github.com/X-D-Lab/MindChat/graphs/contributors) </div> <div align=center><img src ="./assets/image/logo-github.png"/></div> ## 💪 模型进展 **🔥更好的模型永远在路上!🔥** * Jul 23, 2023: 提供MindChat体验地址: [MindChat-创空间](https://modelscope.cn/studios/X-D-Lab/MindChat/summary)、[![Open in OpenXLab](https://cdn-static.openxlab.org.cn/app-center/openxlab_app.svg)](https://openxlab.org.cn/apps/detail/thomas-yanxin/MindChat), 欢迎体验 * Jul 21, 2023: MindChat-InternLM-7B训练完成, 在**模型安全、共情输出、人类价值观对齐**等方面进行针对性强化; * Jul 15, 2023: MindChat-Baichuan-13B训练完成, 作为**首个百亿级参数的心理大模型**正式开源; * Jul 9, 2023: MindChat-beta训练完成, 并正式开源; * Jul 6, 2023: 首次提交MindChat(漫谈)心理大模型; ## 👏 模型介绍 **MindChat(漫谈): 心理大模型**(简称: MindChat)旨在从心理咨询、心理评估、心理诊断、心理治疗四个维度帮助人们解决心理问题, 提高心理健康水平. 作为一款心理大模型, MindChat通过营造轻松、开放的交谈环境, 以放松身心、交流感受或分享经验的方式, 与用户建立信任和理解的关系. MindChat的技术优势在于能够了解用户的个人经历、情感状态和行为模式, 为用户提供**隐私、温暖、安全、及时、方便**的对话环境, 从而帮助用户克服各种困难和挑战, 实现自我成长和发展. MindChat的创新之处在于它不仅是一款高效的心理咨询工具, 预期能够提供心理评估、心理诊断和心理治疗的服务. MindChat的专业团队拥有丰富的心理学背景和经验, 他们的专业知识和技能可以帮助用户更好地理解自己的情感和行为, 从而制定更有效的治疗方案. 无论是在工作场所还是在个人生活中, MindChat期望通过自身的努力和专业知识, 在**严格保护用户隐私的前提下**, **全时段全天候**为用户提供全面的心理支持和诊疗帮助, 同时实现自我成长和发展, 以期为建设一个更加**健康、包容和平等**的社会贡献力量. [![](./assets/image/dialogue.gif)](https://modelscope.cn/studios/X-D-Lab/MindChat/summary) ## 🔥 模型列表 | 模型名称 | 合并后的权重 | | :----: | :----: | | MindChat-InternLM-7B | [ModelScope](https://modelscope.cn/models/X-D-Lab/MindChat-7B/summary) / [HuggingFace](https://huggingface.co/X-D-Lab/MindChat-7B) / [OpenXLab](https://openxlab.org.cn/models/detail/thomas-yanxin/MindChat-InternLM-7B) | | MindChat-Baichuan-13B | [ModelScope](https://modelscope.cn/models/X-D-Lab/MindChat-Baichuan-13B/summary) / [HuggingFace](https://huggingface.co/X-D-Lab/MindChat-baichuan-13B) / OpenXLab | 目前, MindChat已经开源了**MindChat-Baichuan-13B**版本的百亿级MindChat心理大模型. 更为优质的MindChat模型将在不久的未来对外开源开放. 敬请期待! 此外, 本团队同时关注人们的身理健康, 建有安全、可靠、普惠的[中文医疗大模型孙思邈(Sunsimiao)](https://github.com/X-D-Lab/Sunsimiao), 欢迎下载使用, 敬请批评指证! ## 数据词云 目前MindChat的开源模型采用了经过人工清洗的**约20万条的高质量多轮心理对话数据**进行训练, 涵盖**工作、家庭、学习、生活、社交、安全**等多个方面 ![](./assets/image/word_cloud.png) 后续我们将不断完善数据积累以及数据清洗, 争取为开源社区贡献一份高质量的多轮心理对话数据, 共同推动开源心理大模型的建设. 
## 📖 对话案例 目前我们对模型进行了**学习**、**家庭**、**工作**、**朋友**、**安全**这五个方面进行了测试, 后续将整理并提供更为全面的评测结果. 以下是部分案例的摘录, 更多对话案例及比较请见[**此处**](./assets/Dialogue_cases/). ### 🥺 学习 | 轮数 | 用户 | MindChat | | :----: | :----: | :----: | |1|总感觉自己学习很差, 想努力, 但是又控制不住想玩游戏 | 我明白你的感受, 你觉得自己学习不好, 但是总是控制不住想玩游戏. 我想问问你, 你玩游戏的时候是什么感觉呢? 有没有什么特别喜欢的游戏?| |2|我平常特别喜欢玩一些端游来打发时间, 玩游戏的时候很纠结, 明明自己成绩落后, 学习任务还没完成但还是想玩 | 我明白你的感受. 你可能需要认真思考一下, 玩游戏对你来说是不是真的值得投入这么多时间和精力. 学习落后是很正常的事情, 但不要让游戏成为你逃避现实的工具. 你可以尝试制定一个计划, 把玩游戏的时间留给自己, 同时也要合理分配学习任务, 让自己更加充实. | |3|在学习上总感觉自己学的很吃力, 不知道该怎么办, 觉得自己学习能力太差了, 心里很失落 | 我理解你的感受. 每个人的学习能力都是不同的, 不要过分苛求自己. 你可以先尝试找到适合自己的学习方法, 比如寻找学习伙伴, 向老师请教, 或者参加一些学习小组. 同时, 也不要忘记给自己一些休息的时间, 保持身心健康. 学习是一个长期的过程, 不要过于急躁, 慢慢来, 你会越来越好的. | |4|我确实不该把精力都是投入在游戏这些娱乐上, 但是我已经落后很多了, 我现在再去开始学习, 以我这么差的学习能力我还有机会学好吗 | 虽然你现在的学习状态很糟糕, 但并不是没有机会. 学习是一个长期的过程, 只要你持之以恒, 坚持不懈, 就一定会有所进步. 同时, 也不要忘记, 学习的过程也需要一定的动力和动机. 你可以找一些自己感兴趣的学习方向, 或者设定一些小目标, 让自己更有动力去完成. 不要轻易放弃, 相信自己, 你一定可以做到. | |5|可是我很焦虑, 以前落后太多了, 现在不知道从哪里开始 | 焦虑是很正常的情绪, 不要害怕它. 你可以先想想自己目前的学习状态, 列出自己需要学习的科目和内容, 然后按照优先级和难易程度进行排序. 接着, 你可以尝试分解每个科目和内容, 把它们分成小目标, 一步一步地去完成. 这样, 你就可以更有针对性地进行学习, 不会感到太紧张和不知所措. 同时, 也不要忘记给自己一些奖励, 比如看一部电影, 吃一顿美食等等, 让自己更有动力去完成目标. | |6|也就是我现在主要的问题是太焦虑了是吗? 我只要好好的去做, 成绩还是可以提高的对吗 | 是的, 焦虑是一种很常见的情绪, 它会让你感到很不安, 但是它并不会影响你真正地提高成绩. 你可以试着把焦虑的情绪转化成动力, 让它成为你学习的一部分. 同时, 也不要忘记给自己一些时间, 不要过分压抑自己的情绪, 让自己保持一个积极的心态. 只要你持之以恒, 相信自己, 你一定可以取得更好的成绩. | |7|非常谢谢你, 我会好好努力的 | 我不用客气, 我很高兴能帮到你. 我相信你一定可以克服困难, 取得好成绩. 加油! 如果有任何需要帮助的地方, 随时都可以找我聊天. | ### 👨‍👩‍👧‍👦 家庭 | 轮数 | 用户 | MindChat | | :----: | :----: | :----: | |1| 我总感觉爸妈不理解我, 感觉和他们之间有很大的隔阂. | 我能理解你的感受. 和家人的沟通和理解是家庭关系中非常重要的一环. 你是否曾经尝试过和他们分享你的想法和感受, 让他们更好地了解你呢? | |2|我有尝试找他们沟通, 但是他们听不进去我说话 , 认为我是在无理取闹, 我心里很无助 | 我明白你的感受. 有时候我们的家人可能无法理解我们的想法和感受, 这时候我们需要耐心和沟通技巧来帮助他们更好地理解我们. 你可以试着换个方式表达你的想法, 或者找到一些共同话题来促进交流. 同时, 也要记得保持冷静和尊重, 不要使用攻击性的语言来伤害他们. | |3|我也知道他们是为了我好, 平时他们督促我学习也是为了我的未来, 我能体谅他们, 但是有时候自己也会有压力, 想和他们沟通他们就会觉得我在偷懒 | 我能理解你的感受. 有时候我们的家人可能认为我们在偷懒, 但是其实他们只是担心我们的未来和健康. 你可以试着跟他们分享你的压力和焦虑, 让他们了解你的感受. 同时, 也要表达出你对他们的理解和感激, 让他们知道你是愿意和他们一起面对问题, 共同找到解决方案的. | |4|我平时会表现的乖巧, 因为我觉得他们也不容易, 我该怎么去找一个双方都适合的方式去沟通呢 | 很好的问题. 沟通是一个相互的过程, 需要双方的努力和理解. 你可以试着找一些共同话题来促进交流, 同时也要听取他们的想法和建议. 如果你觉得自己的表达方式不够清晰, 可以尝试用一些简单的词汇和表达方式来让他们更容易理解. 最重要的是要保持耐心和尊重, 相信你们会找到适合彼此的方式. | |5|可是我会觉得找他们沟通是一种不懂事的表现, 他们很辛苦我还给他们增加负担, 但是自己忍着难过心里也很压抑 | 我能理解你的感受. 有时候我们会有这样的想法, 但是其实我们不应该让这些想法限制我们的行动. 沟通是解决问题的一种方式, 同时也是增进家庭关系的重要途径. 如果你觉得自己的心情很压抑, 可以试着找一些放松的方式, 比如运动、听音乐或者写日记来缓解压力. 同时也要记得, 你的家人是关心你的, 他们不会增加你的负担, 而是希望你能够更好地成长和幸福. | |6|可能是我的想法有点偏执了, 我应该找一个时间和爸妈好好交流一下 | 这是个好主意. 和家人的交流是非常重要的, 可以让你更好地了解他们的想法和感受. 你可以试着找一个安静的时间, 让他们知道你需要和他们谈一谈, 同时也要准备好自己的态度和想法. 相信通过交流, 你们可以更好地理解彼此, 增进感情. 加油. | |7|谢谢你的建议, 我的心情好多了 | 不用客气, 我很高兴能帮到你. 记得, 沟通和理解是家庭关系中非常重要的一环, 让我们一起为建立更加和谐的家庭关系而努力吧! | 【注:】所有的对话案例生成内容均为一次性生成内容, 未进行多次重新选择后摘录. ## 🙇‍ ‍致谢 本项目由**华东理工大学 薛栋教授课题组**发起, 并受到以下平台及项目的大力支持, 在此表示感谢! 1. [OpenI启智社区](https://openi.pcl.ac.cn/): 提供模型训练算力 2. [魔搭ModelScope](https://modelscope.cn/home)、[OpenXLab](https://openxlab.org.cn/home): 模型存储和体验空间 3. [InternLM](https://github.com/InternLM/InternLM)、[Baichuan](https://github.com/baichuan-inc)提供非常优秀的基础模型 特别感谢**合肥综合性国家科学中心人工智能研究院普适心理计算团队 孙晓教授**对本项目的专业性指导. ``` @misc{2023internlm, title={InternLM: A Multilingual Language Model with Progressively Enhanced Capabilities}, author={InternLM Team}, howpublished = {\url{https://github.com/InternLM/InternLM-techreport}}, year={2023} } ``` ## 👏 欢迎 1. 针对不同用户需求和应用场景, 我们也热情欢迎商业交流和合作, 为各位客户提供个性化的开发和升级服务! 2. 欢迎专业的心理学人士对MindChat进行专业性指导和需求建议, 鼓励开源社区使用并反馈MindChat, 促进我们对下一代MindChat模型的开发. 3. MindChat模型对于学术研究完全开放, 但需要遵循[GPL-3.0 license](./LICENSE)将下游模型开源并[引用](#🤝-引用)本Repo. 
如需对MindChat模型进行商用, 请通过组织主页邮箱发送邮件进行细节咨询.

## ⚠️ 免责声明

本仓库所有开源代码及模型均遵循[GPL-3.0](./LICENSE)开源许可协议. 目前开源的MindChat模型可能存在以下局限:

1. **MindChat**存在固有的局限性, 可能产生错误的、有害的、冒犯性的或其他不良的输出. 用户在关键或高风险场景中应谨慎行事, 不要使用这些模型作为最终决策参考, 以免导致人身伤害、财产损失或重大损失.

2. 在任何情况下, **MindChat**的作者、贡献者或版权所有者均不对因本软件、本软件的使用或其他涉及本软件的交易而产生的任何索赔、损害赔偿或其他责任(无论是合同、侵权还是其他原因)承担责任.

3. 使用**MindChat**即表示您同意这些条款和条件, 并承认您了解其使用可能带来的潜在风险. 您还同意赔偿并使作者、贡献者和版权所有者免受因您使用**MindChat**而产生的任何索赔、损害赔偿或责任的影响.

## 🤝 引用

```
@misc{MindChat,
author={Xin Yan, Dong Xue*},
title = {MindChat: Psychological Large Language Model},
year = {2023},
publisher = {GitHub},
journal = {GitHub repository},
howpublished = {\url{https://github.com/X-D-Lab/MindChat}},
}
```

## 🌟 Star History

[![Star History Chart](https://api.star-history.com/svg?repos=X-D-Lab/MindChat&type=Date)](https://star-history.com/#X-D-Lab/MindChat&Date)
binghe001/spring-redis
https://github.com/binghe001/spring-redis
📚 基于Redis解决缓存击穿、穿透和雪崩问题的通用解决方案,拿来即用。支持存储对象、集合、简单数据类型等。无需提前将数据存入Redis,直接使用提供的分布式缓存接口查询数据即可,附带完善的单元测试用例,方便学习使用。
## 作者简介 - :dog: 互联资深技术专家,[TVP腾讯云最具价值专家](https://cloud.tencent.com/tvp/member/669)、数据库技术专家,分布式与微服务架构专家,全网45万+粉互联网技术博主。 - :bus: 作品:[`BingheGuide | 冰河指南`](https://github.com/binghe001/BingheGuide) | [`《深入理解高并发编程:核心原理与案例实战》`](https://github.com/binghe001/mykit-concurrent-principle) | [`《深入理解高并发编程:JDK核心技术》`](https://github.com/binghe001/mykit-concurrent-jdk) | [`数据同步`](https://github.com/binghe001/mykit-data) | [`Spring核心技术`](https://github.com/binghe001/spring-annotation-book) | [`分布式限流`](https://github.com/binghe001/mykit-ratelimiter) | [`分布式锁`](https://github.com/binghe001/mykit-lock) | [`分布式缓存`](https://github.com/binghe001/mykit-cache) | [`异步并行框架`](https://github.com/binghe001/mykit-async) | [`分布式事务`](https://github.com/binghe001/mykit-transaction-message) | [`简易版IM`](https://github.com/binghe001/mykit-chat) | [`微信SDK`](https://github.com/binghe001/mykit-wechat-sdk) | [`延迟队列`](https://github.com/binghe001/mykit-delay) | [`分布式ID`](https://github.com/binghe001/mykit-serial) | [更多搜索...](https://github.com/binghe001?tab=repositories) - :seedling: 干货:[公众号『 冰河技术 』](https://img-blog.csdnimg.cn/20210426115714643.jpg) - :pencil: 博客:[binghe.gitcode.host](https://binghe.gitcode.host/) - 硬核文章,应有尽有! - :tv: 视频:[B站 冰河技术](https://space.bilibili.com/517638832) - :love_letter: 微信:[hacker_binghe](images/hacker_binghe.jpg) - 备注来意 - :feet: 我的知识星球:[手写企业级中间件项目、大厂高并发秒杀系统、并发编程、性能调优、框架源码、分布式、微服务、1对1解答、答辩晋升技巧、定期直播](https://binghe.gitcode.host/md/zsxq/introduce.html) ## 今年的努力 ✨ <img align="" height="137px" src="https://github-readme-stats.vercel.app/api?username=binghe001&hide_title=true&hide_border=true&show_icons=true&include_all_commits=true&line_height=21&bg_color=0,EC6C6C,FFD479,FFFC79,73FA79&theme=graywhite&locale=cn" /><img align="" height="137px" src="https://github-readme-stats.vercel.app/api/top-langs/?username=binghe001&hide_title=true&hide_border=true&layout=compact&bg_color=0,73FA79,73FDFF,D783FF&theme=graywhite&locale=cn" /> ## 我出版的图书 - 2020年出版[《海量数据处理与大数据技术实战》](https://u.jd.com/K8HYjMB) —— 涵盖企业级大数据处理方方面面的知识,并从零搭建并开发离线批处理系统和在线实时计算系统。 - 2020年出版[《MySQL技术大全:开发、优化与运维实战》](https://u.jd.com/KuH2dOD) —— 程序员与运维工程师必备的700多页的MySQL工具书,涵盖MySQL基础、开发、优化、运维和架构方方面面的知识。 - 2021年出版[《深入理解分布式事务:原理与实战》](https://u.jd.com/KbHp4dI) —— 全网首部系统、深入讲解分布式事务的书籍,涵盖:分布式事务基础、解决方案、原理、源码与实战等篇章,真正从企业实际需求出发,系统、深入的讲解分布式事务,来自京东、阿里、腾讯、蚂蚁金服、滴滴、饿了么、58集团、IBM等互联网大厂及Apache软件基金会的近20位专家高口碑力荐。 - 2022年出版[《深入理解高并发编程:核心原理与案例实战》](https://u.jd.com/KuH2Qhe) —— 全面、系统、深入的讲解高并发技术,涵盖:并发基础、原理、实战与架构的方方面面知识。来自阿里巴巴、蚂蚁金服、京东、高德、CSDN、饿了么、58集团等互联网大厂以及多家互联网独角兽企业的近20位专家高口碑力荐。 - 2022年出版[《深入高平行開發:深度原理&專案實戰》](https://u.jd.com/K8H0mUV) —— 《深入理解高并发编程:核心原理与案例实战》的繁体版书籍。 - 2023年出版[《深入理解高并发编程:JDK核心技术》](https://u.jd.com/izMwOkE)——全面、系统、深入的讲解JDK高并发编程技术,涵盖:JDK并发基础、核心工具和线程池核心技术等篇章,深度剖析了JDK中各种并发类库和线程池的核心源码,并提供手写线程池案例。来自阿里巴巴、腾讯、蚂蚁金服、京东、高德、CSDN、饿了么、58集团等互联网大厂以及多家互联网独角兽企业的近20位专家高口碑力荐。 ## 我的部分电子书 - [《深入理解高并发编程(第2版)》](https://mp.weixin.qq.com/s/E5XZW_HSqKBbgxix6xQGAQ) —— 全书共计 八 大篇章,433页,耗时 4 个月完成,全网最全的高并发编程知识手册,涵盖:学习路线、源码分析、JDK核心类库、基础案例、实战案例、性能优化、面试、和系统架构等篇章,发布当月下载量即突破5万,全网累计下载30万+。 - [《深入理解高并发编程(第1版)》](https://mp.weixin.qq.com/s/yPUzyssTJ3Jv9prljhrmrQ) —— 全书共计 五 大篇章节,392页,耗时6个月完成。涵盖源码分析、基础案例、实战案例、面试和系统架构等篇章,全网累计下载65万+。 - 《[SpringCloud Alibaba实战](https://mp.weixin.qq.com/s/nuK5qC0TCQ6UYlyv0Y0m8A)》—— 全书共计 十 大篇章,26个章节,332页,涵盖:专栏设计、微服务介绍、微服务环境搭建、服务治理、服务容错、服务网关、链路追踪、消息服务、服务配置和分布式事务等篇章,全网累计下载10万+。 - 《[冰河的渗透实战笔记](https://mp.weixin.qq.com/s/maXLSjgqTkkv4-BvYFkgzA)》—— 冰河整理的全网首个开源的以实战案例为背景的渗透实战笔记,全书共442页,共计37万字(不计空格)。整本书的内容涵盖:Kali基础、渗透工具、木马制作、钓鱼链接生成、爆破密码、内存溢出攻击、web渗透、数据提权、社会工程学,全网累计下载20万+。 - 
《[从零开始写RPC框架(基础版)](https://mp.weixin.qq.com/s/Z13jGNY94rEewp1HW9yiPQ)》 —— 在RPC框架中会实现同步调用、异步调用、回调调用、单向调用和泛化调用。框架中会使用大量的SPI技术来实现扩展性,其中,整个框架会采用微内核、插件化的开发模式,最终真正做一款可在实际场景使用的RPC框架。 - [《MySQL核心技术》](https://mp.weixin.qq.com/s/jd7vTSZ7wqj3BMLRlhn3DA) —— 全书共19章、16万字,涵盖了MySQL核心基础知识的方方面面内容,非常适合MySQL数据库初学者入门,也适合MySQL技术高手系统化的梳理MySQL基础知识,及时查漏补缺,全网累计下载5万+。 - [《Spring IOC核心技术》](https://mp.weixin.qq.com/s/5wCnprKkMRx0Pd1gDImKtg) —— 全书共27章,19万字,涵盖Spring IOC最核心的技术,全网累计下载8万+。 ## 联系我 添加微信:[hacker_binghe](images/hacker_binghe.jpg) - 备注来意 ## 关于星球 带你学习,一起成长,专注互联网高并发、高性能、高可用,高可扩展实战型项目和手写中间件项目,以及分享各种硬核技术,从项目中顺带总结各种技术知识点和面试面经。不忘初心,努力成为一个简单、干净而纯粹的技术人,为读者提供有价值的实战型项目和中间件项目以及各种硬核技术。 **加入星球,你将获得:** **1.项目学习**:微服务入门必备的SpringCloud Alibaba实战项目、手写RPC项目—所有大厂都需要的项目【含上百个经典面试题】、深度解析Spring6核心技术—只要学习Java就必须深度掌握的框架【含数十个经典思考题】、Seckill秒杀系统项目—进大厂必备高并发、高性能和高可用技能 **2.框架源码**:手写RPC项目—所有大厂都需要的项目【含上百个经典面试题】、深度解析Spring6核心技术—只要学习Java就必须深度掌握的框架【含数十个经典思考题】。 **3.硬核技术**:深入理解高并发系列(全册)、深入理解JVM系列(全册)、深入浅出Java设计模式(全册)、MySQL核心知识(全册)。 **4.技术小册**:深入理解高并发编程(第1版)、深入理解高并发编程(第2版)、从零开始手写RPC框架、SpringCloud Alibaba实战、冰河的渗透实战笔记、MySQL核心知识手册、Spring IOC核心技术、Nginx核心技术、面经手册 **5.技术与就业指导**:提供相关就业辅导和未来发展指引,冰河从初级程序员不断沉淀,成长,突破,一路成长为互联网资深技术专家,相信我的经历和经验对你有所帮助。 冰河的知识星球是一个简单、干净、纯粹交流技术的星球,不吹水,目前领券加入享5折优惠,价值远超门票。加入星球的用户,记得添加冰河微信:**hacker_binghe**,冰河拉你进星球专属VIP交流群。 <div align="center"> <img src="images/xingqiu_149.png?raw=true" width="70%"> <br/> </div> ## 关于本仓库 📚 基于Redis解决缓存击穿、穿透和雪崩问题的通用解决方案,拿来即用。支持存储对象、集合、简单数据类型等。无需提前将数据存入Redis,直接使用提供的分布式缓存接口查询数据即可,附带完善的单元测试用例,方便学习使用。 分布式缓存核心接口源码详见:`io.binghe.redis.cache.distribute.DistributeCacheService` 默认基于Redis的实现类的源码详见:`io.binghe.redis.cache.distribute.redis.RedisDistributeCacheService` 也可以基于SpringBoot的`@ConditionalOnProperty`注解扩展基于其他缓存中间件的实现类 项目配有完善的单元测试用例,具体测试源码详见:`src/test/java`目录下的`io.binghe.redis.test.DistributeCacheServiceTest` ### 核心接口定义 分布式缓存核心接口源码详见:`io.binghe.redis.cache.distribute.DistributeCacheService` ```java /** * @author binghe(微信 : hacker_binghe) * @version 1.0.0 * @description 分布式缓存接口,通用型接口,在满足分布式缓存的需求时,解决了缓存击穿、穿透和雪崩的问题 * @github https://github.com/binghe001 * @copyright 公众号: 冰河技术 */ public interface DistributeCacheService { /** * 永久缓存 * @param key 缓存key * @param value 缓存value */ void set(String key, Object value); /** * 将数据缓存一段时间 * @param key 缓存key * @param value 缓存value * @param timeout 物理缓存的时长 * @param unit 物理时间单位 */ void set(String key, Object value, Long timeout, TimeUnit unit); /** * 保存缓存时设置逻辑过期时间 * @param key 缓存key * @param value 缓存value * @param timeout 缓存逻辑过期时长 * @param unit 缓存逻辑时间单位 */ void setWithLogicalExpire(String key, Object value, Long timeout, TimeUnit unit); /** * 获取缓存中的数据 * @param key 缓存key * @return 缓存value */ String get(String key); /** * 带参数查询对象和简单类型数据,防止缓存穿透 * @param keyPrefix 缓存key的前缀 * @param id 缓存的业务标识, * @param type 缓存的实际对象类型 * @param dbFallback 查询数据库的Function函数 * @param timeout 缓存的时长 * @param unit 时间单位 * @return 返回业务数据 * @param <R> 结果泛型 * @param <ID> 查询数据库参数泛型,也是参数泛型类型 */ <R,ID> R queryWithPassThrough(String keyPrefix, ID id, Class<R> type, Function<ID, R> dbFallback, Long timeout, TimeUnit unit); /** * 不带参数查询对象和简单类型数据,防止缓存穿透 * @param keyPrefix key的前缀 * @param type 缓存的实际对象类型 * @param dbFallback 无参数查询数据库数据 * @param timeout 缓存的时长 * @param unit 时间单位 * @return 返回业务数据 * @param <R> 结果泛型 */ <R> R queryWithPassThroughWithoutArgs(String keyPrefix, Class<R> type, Supplier<R> dbFallback, Long timeout, TimeUnit unit); /** * 带参数查询集合数据,防止缓存穿透 * @param keyPrefix 缓存key的前缀 * @param id 缓存的业务标识, * @param type 缓存的实际对象类型 * @param dbFallback 查询数据库的Function函数 * @param timeout 缓存的时长 * @param unit 时间单位 * @return 返回业务数据 
* @param <R> 结果泛型 * @param <ID> 查询数据库参数泛型,也是参数泛型类型 */ <R,ID> List<R> queryWithPassThroughList(String keyPrefix, ID id, Class<R> type, Function<ID, List<R>> dbFallback, Long timeout, TimeUnit unit); /** * 不带参数查询集合数据,防止缓存穿透 * @param keyPrefix 缓存key的前缀 * @param type 缓存的实际对象类型 * @param dbFallback 无参数查询数据库数据 * @param timeout 缓存的时长 * @param unit 时间单位 * @return 返回业务数据 * @param <R> 结果泛型 */ <R> List<R> queryWithPassThroughListWithoutArgs(String keyPrefix, Class<R> type, Supplier<List<R>> dbFallback, Long timeout, TimeUnit unit); /** * 带参数查询数据,按照逻辑过期时间读取缓存数据,新开线程重建缓存,其他线程直接返回逻辑过期数据,不占用资源 * @param keyPrefix 缓存key的前缀 * @param id 缓存业务标识,也是查询数据库的参数 * @param type 缓存的实际对象类型 * @param dbFallback 查询数据库的Function函数 * @param timeout 缓存逻辑过期时长 * @param unit 缓存逻辑过期时间单位 * @return 业务数据 * @param <R> 结果数据泛型类型 * @param <ID> 查询数据库泛型类型,也是参数泛型类型 */ <R, ID> R queryWithLogicalExpire(String keyPrefix, ID id, Class<R> type, Function<ID, R> dbFallback, Long timeout, TimeUnit unit); /** * 不带参数查询数据,按照逻辑过期时间读取缓存数据,新开线程重建缓存,其他线程直接返回逻辑过期数据,不占用资源 * @param keyPrefix 缓存key的前缀 * @param type 缓存的实际对象类型 * @param dbFallback 无参数查询数据库数据 * @param timeout 缓存的时长 * @param unit 时间单位 * @return 返回业务数据 * @param <R> 结果泛型 */ <R> R queryWithLogicalExpireWithoutArgs(String keyPrefix, Class<R> type, Supplier<R> dbFallback, Long timeout, TimeUnit unit); /** * 带参数查询集合数据,按照逻辑过期时间读取缓存数据,新开线程重建缓存,其他线程直接返回逻辑过期数据,不占用资源 * @param keyPrefix 缓存key的前缀 * @param id 缓存业务标识,也是查询数据库的参数 * @param type 缓存的实际对象类型 * @param dbFallback 查询数据库的Function函数 * @param timeout 缓存逻辑过期时长 * @param unit 缓存逻辑过期时间单位 * @return 业务数据 * @param <R> 结果数据泛型类型 * @param <ID> 查询数据库泛型类型,也是参数泛型类型 */ <R, ID> List<R> queryWithLogicalExpireList(String keyPrefix, ID id, Class<R> type, Function<ID, List<R>> dbFallback, Long timeout, TimeUnit unit); /** * 不带参数查询集合数据,按照逻辑过期时间读取缓存数据,新开线程重建缓存,其他线程直接返回逻辑过期数据,不占用资源 * @param keyPrefix 缓存key的前缀 * @param type 缓存的实际对象类型 * @param dbFallback 无参数查询数据库数据 * @param timeout 缓存的时长 * @param unit 时间单位 * @return 返回业务数据 * @param <R> 结果泛型 */ <R> List<R> queryWithLogicalExpireListWithoutArgs(String keyPrefix, Class<R> type, Supplier<List<R>> dbFallback, Long timeout, TimeUnit unit); /** * 带参数查询数据,按照互斥锁方式获取缓存数据,同一时刻只有一个线程访问数据库,其他线程访问不到数据重试 * @param keyPrefix 缓存key的前缀 * @param id 缓存业务标识,也是查询数据库的参数 * @param type 缓存的实际对象类型 * @param dbFallback 查询数据库的Function函数 * @param timeout 缓存时长 * @param unit 时间单位 * @return 业务数据 * @param <R> 结果数据泛型类型 * @param <ID> 查询数据库泛型类型,也是参数泛型类型 */ <R, ID> R queryWithMutex(String keyPrefix, ID id, Class<R> type, Function<ID, R> dbFallback, Long timeout, TimeUnit unit); /** * 不带参数查询数据,按照互斥锁方式获取缓存数据,同一时刻只有一个线程访问数据库,其他线程访问不到数据重试 * @param keyPrefix 缓存key的前缀 * @param type 缓存的实际对象类型 * @param dbFallback 无参数查询数据库数据 * @param timeout 缓存时长 * @param unit 时间单位 * @return 返回业务数据 * @param <R> 结果泛型 */ <R> R queryWithMutexWithoutArgs(String keyPrefix, Class<R> type, Supplier<R> dbFallback, Long timeout, TimeUnit unit); /** * 带参数查询数据,按照互斥锁方式获取缓存数据,同一时刻只有一个线程访问数据库,其他线程访问不到数据重试 * @param keyPrefix 缓存key的前缀 * @param id 缓存业务标识,也是查询数据库的参数 * @param type 缓存的实际对象类型 * @param dbFallback 查询数据库的Function函数 * @param timeout 缓存时长 * @param unit 时间单位 * @return 业务数据 * @param <R> 结果数据泛型类型 * @param <ID> 查询数据库泛型类型,也是参数泛型类型 */ <R, ID> List<R> queryWithMutexList(String keyPrefix, ID id, Class<R> type, Function<ID, List<R>> dbFallback, Long timeout, TimeUnit unit); /** * 不带参数查询数据,按照互斥锁方式获取缓存数据,同一时刻只有一个线程访问数据库,其他线程访问不到数据重试 * @param keyPrefix 缓存key的前缀 * @param type 缓存的实际对象类型 * @param dbFallback 无参数查询数据库数据 * @param timeout 缓存时长 * @param unit 时间单位 * @return 返回业务数据 * @param <R> 结果泛型 */ <R> List<R> 
queryWithMutexListWithoutArgs(String keyPrefix, Class<R> type, Supplier<List<R>> dbFallback, Long timeout, TimeUnit unit); /** * 将对象类型的json字符串转换成泛型类型 * @param obj 未知类型对象 * @param type 泛型Class类型 * @return 泛型对象 * @param <R> 泛型 */ default <R> R getResult(Object obj, Class<R> type){ if (obj == null){ return null; } //简单类型 if (TypeConversion.isSimpleType(obj)){ return Convert.convert(type, obj); } return JSONUtil.toBean(JSONUtil.toJsonStr(obj), type); } /** * 将对象类型的json字符串转换成泛型类型的List集合 * @param str json字符串 * @param type 泛型Class类型 * @return 泛型List集合 * @param <R> 泛型 */ default <R> List<R> getResultList(String str, Class<R> type){ if (StrUtil.isEmpty(str)){ return null; } return JSONUtil.toList(JSONUtil.parseArray(str), type); } /** * 获取简单的key * @param key key * @return 返回key */ default String getKey(String key){ return getKey(key, null); } /** * 不确定参数类型的情况下,使用MD5计算参数的拼接到Redis中的唯一Key * @param keyPrefix 缓存key的前缀 * @param id 泛型参数 * @return 拼接好的缓存key * @param <ID> 参数泛型类型 */ default <ID> String getKey(String keyPrefix, ID id){ if (id == null){ return keyPrefix; } String key = ""; //简单数据类型与简单字符串 if (TypeConversion.isSimpleType(id)){ key = StrUtil.toString(id); }else { key = MD5.create().digestHex(JSONUtil.toJsonStr(id)); } if (StrUtil.isEmpty(key)){ key = ""; } return keyPrefix.concat(key); } /** * 获取要保存到缓存中的value字符串,可能是简单类型,也可能是对象类型,也可能是集合数组等 * @param value 要保存的value值 * @return 处理好的字符串 */ default String getValue(Object value){ return TypeConversion.isSimpleType(value) ? String.valueOf(value) : JSONUtil.toJsonStr(value); } } ``` ### 使用案例(测试用例) **1.User类模拟从数据库查询对象类型的数据** 源码详见:io.binghe.redis.test.bean.User ```java public class User { private Long id; private String name; public User() { } public User(Long id, String name) { this.id = id; this.name = name; } public Long getId() { return id; } public void setId(Long id) { this.id = id; } public String getName() { return name; } public void setName(String name) { this.name = name; } @Override public String toString() { return JSONUtil.toJsonStr(this); } } ``` **2.DistributeCacheServiceTest类测试各种场景** 源码详见:io.binghe.redis.test.DistributeCacheServiceTest ```java @SpringBootTest @RunWith(SpringRunner.class) public class DistributeCacheServiceTest { @Autowired private DistributeCacheService distributeCacheService; @Test public void testQueryWithPassThrough(){ User user = distributeCacheService.queryWithPassThrough("pass:through:", 1002852L, User.class, this::getUser, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(user)); } @Test public void testQueryWithPassThroughWithoutArgs(){ User user = distributeCacheService.queryWithPassThroughWithoutArgs("pass:through001:", User.class, this::getUserWithoutArgs, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(user)); } @Test public void testQuerySimpleDataWithPassThrough(){ Integer id = distributeCacheService.queryWithPassThrough("pass:through2:", 100285210, Integer.class, this::getId, 60L, TimeUnit.SECONDS); System.out.println(id); } @Test public void testQuerySimpleDataWithPassThroughWithoutArgs(){ Integer id = distributeCacheService.queryWithPassThroughWithoutArgs("pass:through2002:", Integer.class, this::getIdWithoutArgs, 60L, TimeUnit.SECONDS); System.out.println(id); } @Test public void testQueryWithPassThroughList(){ List<User> list = distributeCacheService.queryWithPassThroughList("pass:through:list:", null, User.class, this::getUserList, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(list)); } @Test public void testQueryWithPassThroughListWithoutArgs(){ List<User> list 
= distributeCacheService.queryWithPassThroughListWithoutArgs("pass:through:list003:", User.class, this::getUserListWithoutArgs, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(list)); } @Test public void testQuerySimpleDataWithPassThroughList(){ List<Integer> list = distributeCacheService.queryWithPassThroughList("pass:through:list2:", 100285211, Integer.class, this::getIds, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(list)); } @Test public void testQuerySimpleDataWithPassThroughListWithoutArgs(){ List<Integer> list = distributeCacheService.queryWithPassThroughListWithoutArgs("pass:through:list2004:", Integer.class, this::getIdsWithoutArgs, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(list)); } @Test public void testQueryWithLogicalExpire(){ User user = distributeCacheService.queryWithLogicalExpire("logical:expire:", 1002852L, User.class, this::getUser, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(user)); } @Test public void testQueryWithLogicalExpireWithoutArgs(){ User user = distributeCacheService.queryWithLogicalExpireWithoutArgs("logical:expire005:", User.class, this::getUserWithoutArgs, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(user)); } @Test public void testQuerySimpleDataWithLogicalExpire(){ Integer id = distributeCacheService.queryWithLogicalExpire("logical:expire2:", 100285212, Integer.class, this::getId, 60L, TimeUnit.SECONDS); System.out.println(id); } @Test public void testQuerySimpleDataWithLogicalExpireWithoutArgs(){ Integer id = distributeCacheService.queryWithLogicalExpireWithoutArgs("logical:expire2006:", Integer.class, this::getIdWithoutArgs, 60L, TimeUnit.SECONDS); System.out.println(id); } @Test public void testQueryWithLogicalExpireList(){ List<User> list = distributeCacheService.queryWithLogicalExpireList("logical:expire:list:", null, User.class, this::getUserList, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(list)); } @Test public void testQueryWithLogicalExpireListWithoutArgs(){ List<User> list = distributeCacheService.queryWithLogicalExpireListWithoutArgs("logical:expire:list007:", User.class, this::getUserListWithoutArgs, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(list)); } @Test public void testQuerySimpleDataWithLogicalExpireList(){ List<Integer> list = distributeCacheService.queryWithLogicalExpireList("logical:expire:list2:", 100285213, Integer.class, this::getIds, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(list)); } @Test public void testQuerySimpleDataWithLogicalExpireListWithoutArgs(){ List<Integer> list = distributeCacheService.queryWithLogicalExpireListWithoutArgs("logical:expire:list2008:", Integer.class, this::getIdsWithoutArgs, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(list)); } @Test public void testQueryWithMutex(){ User user = distributeCacheService.queryWithMutex("mutex:", 1002852L, User.class, this::getUser, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(user)); } @Test public void testQueryWithMutexWithoutArgs(){ User user = distributeCacheService.queryWithMutexWithoutArgs("mutex009:", User.class, this::getUserWithoutArgs, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(user)); } @Test public void testQuerySimpleDataWithMutex(){ Integer id = distributeCacheService.queryWithMutex("mutex2:", 100285214, Integer.class, this::getId, 60L, TimeUnit.SECONDS); System.out.println(id); } @Test public void testQuerySimpleDataWithMutexWithoutArgs(){ Integer id = 
distributeCacheService.queryWithMutexWithoutArgs("mutex2010:", Integer.class, this::getIdWithoutArgs, 60L, TimeUnit.SECONDS); System.out.println(id); } @Test public void testQueryWithMutexList(){ List<User> list = distributeCacheService.queryWithMutexList("mutex:list:", null, User.class, this::getUserList, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(list)); } @Test public void testQueryWithMutexListWithoutArgs(){ List<User> list = distributeCacheService.queryWithMutexListWithoutArgs("mutex:list011:", User.class, this::getUserListWithoutArgs, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(list)); } @Test public void testQuerySimpleDataWithMutexList(){ List<Integer> list = distributeCacheService.queryWithMutexList("mutex:list2:", 123, Integer.class, this::getIds, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(list)); } @Test public void testQuerySimpleDataWithMutexListWithoutArgs(){ List<Integer> list = distributeCacheService.queryWithMutexListWithoutArgs("mutex:list2012:", Integer.class, this::getIdsWithoutArgs, 60L, TimeUnit.SECONDS); System.out.println(JSONUtil.toJsonStr(list)); } /** * 模拟带参数从数据库查询对象 */ public User getUser(Long id){ return new User(id, "binghe"); } /** * 默认不带参数从数据库查询对象 */ public User getUserWithoutArgs(){ return new User(1L, "binghe"); } /** * 模拟带参数查询从数据库对象列表 */ public List<User> getUserList(String type){ return Arrays.asList( new User(1L, "binghe001"), new User(2L, "binghe002"), new User(3L, "binghe003") ); } /** * 模拟不带参数从数据库查询对象列表 */ public List<User> getUserListWithoutArgs(){ return Arrays.asList( new User(1L, "binghe001"), new User(2L, "binghe002"), new User(3L, "binghe003") ); } /** * 模拟带参数从数据库查询简单数据类型数据 */ public Integer getId(Integer id){ return id; } /** * 模拟不带参数从数据库查询简单数据类型数据 */ public Integer getIdWithoutArgs(){ return 0; } /** * 模拟带参数从数据库查询简单数据类型数据列表 */ public List<Integer> getIds(Integer id){ return Arrays.asList(0,0,0); } /** * 模拟不带参数从数据库查询简单数据类型数据列表 */ public List<Integer> getIdsWithoutArgs(){ return Arrays.asList(0,0,0); } } ``` <div align="center"> <a href="https://github.com/binghe001/BingheGuide">关注冰河技术,解锁更多技能!</a> </div> ## 加群交流 本群的宗旨是给大家提供一个良好的技术学习交流平台,所以杜绝一切广告!由于微信群人满 100 之后无法加入,请扫描下方二维码先添加作者 “冰河” 微信(hacker_binghe),备注:`学习加群`。 <div align="center"> <img src="images/hacker_binghe.jpg?raw=true" width="180px"> <div style="font-size: 18px;">冰河微信</div> <br/> </div> ## 公众号 分享各种编程语言、开发技术、分布式与微服务架构、分布式数据库、分布式事务、云原生、大数据与云计算技术和渗透技术。另外,还会分享各种面试题和面试技巧。内容在 **冰河技术** 微信公众号首发,强烈建议大家关注。 <div align="center"> <img src="images/ice_wechat.jpg?raw=true" width="180px"> <div style="font-size: 18px;">公众号:冰河技术</div> <br/> </div> ## 视频号 定期分享各种编程语言、开发技术、分布式与微服务架构、分布式数据库、分布式事务、云原生、大数据与云计算技术和渗透技术。另外,还会分享各种面试题和面试技巧。 <div align="center"> <img src="images/ice_video.png?raw=true" width="180px"> <div style="font-size: 18px;">视频号:冰河技术</div> <br/> </div> ## 星球 加入星球 **[冰河技术](http://m6z.cn/6aeFbs)**,可以获得本站点所有学习内容的指导与帮助。如果你遇到不能独立解决的问题,也可以添加冰河的微信:**hacker_binghe**, 我们一起沟通交流。另外,在星球中不只能学到实用的硬核技术,还能学习**实战项目**! 
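**补充示例:业务层接入分布式缓存(示意)**

承接上文「关于本仓库」与「使用案例(测试用例)」部分:无需提前将数据写入Redis,业务代码直接注入`DistributeCacheService`并调用查询接口即可。下面补充一个业务层接入的示意写法(注意:`GoodsQueryService`、`Goods`以及缓存key前缀均为演示而假设的命名,并非仓库中的真实代码),演示按逻辑过期方式查询详情数据,缓存重建由接口内部完成,从而规避缓存击穿:

```java
import java.util.concurrent.TimeUnit;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import io.binghe.redis.cache.distribute.DistributeCacheService;

/**
 * 业务层接入分布式缓存的示意代码(类名、key前缀等均为演示用的假设命名,并非仓库源码)
 */
@Service
public class GoodsQueryService {

    @Autowired
    private DistributeCacheService distributeCacheService;

    /**
     * 查询商品详情:按逻辑过期方式读取缓存,缓存过期时由接口内部新开线程重建缓存,
     * 其他线程直接返回逻辑过期的旧数据,避免热点key失效瞬间打到数据库(缓存击穿)
     */
    public Goods getGoodsDetail(Long goodsId) {
        return distributeCacheService.queryWithLogicalExpire(
                "demo:goods:detail:", goodsId, Goods.class,
                this::loadGoodsFromDb, 60L, TimeUnit.SECONDS);
    }

    /**
     * 模拟查询数据库:实际项目中应替换为 DAO/Mapper 的查询逻辑
     */
    private Goods loadGoodsFromDb(Long goodsId) {
        return new Goods(goodsId, "示例商品");
    }

    /**
     * 示意用的业务对象:提供无参构造和getter/setter,便于缓存内部做JSON序列化与反序列化
     */
    public static class Goods {
        private Long id;
        private String name;

        public Goods() {
        }

        public Goods(Long id, String name) {
            this.id = id;
            this.name = name;
        }

        public Long getId() {
            return id;
        }

        public void setId(Long id) {
            this.id = id;
        }

        public String getName() {
            return name;
        }

        public void setName(String name) {
            this.name = name;
        }
    }
}
```

若需要替换底层缓存中间件,按「关于本仓库」所述基于SpringBoot的`@ConditionalOnProperty`注解提供新的`DistributeCacheService`实现类即可,业务层的调用方式保持不变。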
关注 [冰河技术](images/ice_wechat.jpg)公众号,回复 `星球` 可以获取入场优惠券。 <div align="center"> <img src="images/xingqiu.png?raw=true" width="180px"> <div style="font-size: 18px;">知识星球:冰河技术</div> <br/> </div> ## 冰河整理PDF 关注 **冰河技术** 微信公众号: 回复 “**并发编程2**” 领取《深入理解高并发编程(第2版)》PDF电子书。 回复 “**并发编程**” 领取《深入理解高并发编程(第1版)》PDF电子书。 回复 “**并发源码**” 领取《并发编程核心知识(源码分析篇 第1版)》PDF电子书。 回复 “**并发路线**” 领取并发编程高清学习路线。 回复 “**SA实战**” 领取《SpringCloud Alibaba实战》PDF电子书。 回复 “**渗透笔记**” 领取《冰河的渗透实战笔记》PDF电子书。 回复 “**ngx2**” 获取《Nginx核心技术手册》PDF电子书。 回复 “**我要进大厂**” 领取《我要进大厂系列之面试圣经(第1版)》PDF电子书。 回复 ”**限流**“ 领取《亿级流量下的分布式解决方案》PDF电子书。 回复 “**设计模式**” 领取《深入浅出Java23种设计模式》PDF电子书。 回复 “**Java8新特性**” 领取 《Java8新特性教程》PDF电子书。 回复 “**分布式存储**” 领取《跟冰河学习分布式存储技术》 PDF电子书。 回复 “**Nginx**” 领取《跟冰河学习Nginx技术》PDF电子书。 回复 “**互联网工程**” 领取《跟冰河学习互联网工程技术》PDF电子书。 回复 “**冰河索引**” 领取《冰河技术公号文章索引》PDF电子书。 回复 “**星球**” 获取知识星球优惠券 。 ## 公众号文章同步(善用 Ctrl+F 搜索文章) ### 我的新书 <div align="center"> <img src="images/concurrent-003.jpg?raw=true" width="80%"> <div style="font-size: 18px;"><a href="https://u.jd.com/izMwOkE">《深入理解高并发编程:JDK核心技术》</a></div> <br/> </div> <div align="center"> <img src="images/concurrent-001.jpg?raw=true" width="80%"> <div style="font-size: 18px;"><a href="https://item.jd.com/13190783.html">《深入理解高并发编程:核心原理与案例实战》</a></div> <br/> </div> <div align="center"> <img src="images/concurrent-002.png?raw=true" width="80%"> <div style="font-size: 18px;"><a href="https://item.jd.com/10067507938306.html">《深入高平行開發:深度原理&專案實戰》</a></div> <br/> </div> <div align="center"> <img src="images/transaction.png?raw=true" width="80%"> <div style="font-size: 18px;"><a href="https://item.jd.com/12972343.html">《深入理解分布式事务:原理与实战》</a></div> <br/> </div> <div align="center"> <img src="images/mysql.png?raw=true" width="80%"> <div style="font-size: 18px;"><a href="https://item.jd.com/13036154.html">《MySQL技术大全:开发、优化与运维实战》</a></div> <br/> </div> <div align="center"> <img src="images/bigdata.png?raw=true" width="80%"> <div style="font-size: 18px;"><a href="https://item.jd.com/12710993.html">《海量数据处理与大数据技术实战》</a></div> <br/> </div> ### 书籍介绍 * [《深入理解高并发编程:JDK核心技术》-冰河新书上市](https://mp.weixin.qq.com/s/R4lqlbXpZsFTOUYALFWZUg) * [十年大促亿级流量并发编程总结与规划!](https://mp.weixin.qq.com/s/u5t88cAq1eWuoEzF9qZKMQ) * [听说B站和ChatGPT都被搞宕机了?](https://mp.weixin.qq.com/s/5ipbD7oKEHo5zvBrcWCxNQ) * [沉淀、成长、突破,下一本会是啥?](https://mp.weixin.qq.com/s/CfZHDlfR3g9kQtDyZfa9nw) * [我竟然买不起自己出版的书!!](https://mp.weixin.qq.com/s/sPn11eI1zeZyWwy_TqWHOg) * [多年亿级流量下的高并发经验总结,我毫无保留的写在了这本书中](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247502342&idx=1&sn=2fed57688d1f747dd1f167648b5f629d&chksm=cf567d07f821f411080fd0a91a8f6b0e93467be15ccecbf1250a2b5c79064265a48d6def0e0c&token=2138570566&lang=zh_CN#rd) * [冰河最新出版的《深入理解高并发编程:核心原理与案例实战》到底讲了些啥?(视频为证)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247502370&idx=1&sn=7ca6db495e46a89e707d6390f19eb7b4&chksm=cf567d23f821f4354fe2fa8f6f93502abdac9f3514c1f18ca672ad489c0b402a0b9e1254550c&token=2138570566&lang=zh_CN#rd) * [让大家久等了,冰河的《深入理解分布式事务:原理与实战》终于出版了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497263&idx=1&sn=8228a795261b0a90a3f42322368616fc&chksm=cf56412ef821c8389584e9e608aa4d6515cf68e26758b9c578a8f60a796999fafc8686a6a425&token=1170633251&lang=zh_CN#rd) * [冰河获奖啦!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499558&idx=1&sn=c2bf24fd7b6615aeab489c93c65f7fdb&chksm=cf564827f821c1312adffa0b13bf2ffd24be1e378cd46ebaf96d97ddddcb0a22a3339af5058c&token=2138570566&lang=zh_CN#rd) * 
[没别的,这次全是图!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497328&idx=1&sn=2220b14a5874b225ee032f2d53e55b92&chksm=cf564171f821c867a6dacd4ca6e6189136eb2de7a9a53a75338c6f4b347d2ad7a7d48a1399d9&token=1170633251&lang=zh_CN#rd) * [预售第一天霸榜No.1](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497347&idx=1&sn=acf9fc57ecc5c44b5d28877bfd55c4d3&chksm=cf564182f821c894de668c19592419f2e629f5f200b7a548dc87e7b93f5ed0543d97a0960c4a&token=1170633251&lang=zh_CN#rd) * [分布式事务开山之作草图曝光!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497427&idx=1&sn=1bc426317ef3d63109b5c6b679b60bc3&chksm=cf5641d2f821c8c425cb3b2f3edd802a16f27ecc31f6881a6180bf927d39c335a27a5e825f9a&token=1170633251&lang=zh_CN#rd) * [周六,签了100本书!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497449&idx=1&sn=33ede1de6b2f66f21be579720f3419af&chksm=cf5641e8f821c8fe66d3f7d0379953169d4f8d7bfd44615de766796ece591685e9030a16232f&token=1170633251&lang=zh_CN#rd) * [成绩出来了,三项第一!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497530&idx=1&sn=aae63e5c000ff6a310a227ec901eaa2d&chksm=cf56403bf821c92db65e9092a1d2c511e70db7adca1021d7342ffbc02db8afa5b96fb3007f75&token=1170633251&lang=zh_CN#rd) * [这本书爆了,上架不到6天紧急加印!!!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497556&idx=1&sn=e5346ebb0dde3884137e13c7e68cb319&chksm=cf564055f821c943fced4e7d7729ca1ff9416ea9bab12b6b3e220b78311cb385c2b0e77c5249&token=1170633251&lang=zh_CN#rd) * [又要加印了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497872&idx=1&sn=e5f7adb745339e3fa04c3f7174daadc6&chksm=cf564f91f821c68722cf344de013a9650006337f6d711142b17522066782c0041a6f6c9f680c&token=2138570566&lang=zh_CN#rd) * [冰河作客开源中国“高手问答”栏目了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497953&idx=1&sn=03b0377302d0ea619519b800390cbe94&chksm=cf564fe0f821c6f6048321e35fa84b26ceb028b83e62cb80e457427e2d2105761afa0e70acd7&token=2138570566&lang=zh_CN#rd) * [我的《海量数据处理与大数据技术实战》出版啦!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489216&idx=1&sn=ba163354bcc2ce902208bd9295065ab1&chksm=cf55a1c1f82228d7d1d1b71939614e30b58fd44b1cdea452e4afad57e53f642af815cd268b0c&token=776659970&lang=zh_CN#rd) * [你敢信?冰河的《海量数据处理与大数据技术实战》免费送!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489195&idx=1&sn=dd00de3ebd46aca9e318ff513cb23fcb&chksm=cf55a1aaf82228bc3b49ed8b338f0511fb465548d1ddb11489355cdac031cb5ae8b412f42baf&token=776659970&lang=zh_CN#rd) * [《海量数据处理与大数据技术实战》读者福利!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489115&idx=1&sn=464b0c26af355588db4efda150ca646f&chksm=cf55a15af822284c3e38e542cfb7824d027c947495316ebce26a64abdd27e65a8d10ecce3481&token=776659970&lang=zh_CN#rd) * [两本书,你更喜欢哪一本?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489114&idx=1&sn=f6d3ecc53f7b53e91e050aa6d9e4b9e0&chksm=cf55a15bf822284d4759def938cb06a0da92aadd9768308aedd54bc86e09f9f708b916f957fd&token=776659970&lang=zh_CN#rd) * [冰河又一MySQL技术力作出版(文末送书)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489670&idx=1&sn=fe4e1d4f3db05607e37f35cbd7837fc8&chksm=cf55af87f8222691c42b252444a2d5ec1989aadbc0956463169fe6bc8f77c4977b3a4d2dfe9c&token=776659970&lang=zh_CN#rd) #### 星球专栏 * [一个人也许走的很快,但一群人走的更远!](https://mp.weixin.qq.com/s/tpwdPO0qtqwbInppIvgY9g) * [假期做了一项调研:结果合乎情理!](https://mp.weixin.qq.com/s/uoR0fcZ4TOb4PbUviYDMxw) * [五一假期,正式开卷(再送超大福利)](https://mp.weixin.qq.com/s/CcfLI9aH9RSSiK7dWVx4uA) * [冰河生日,限时5折!!](https://mp.weixin.qq.com/s/UlyCSK-tdfZIHRWNXKz2qA) * [自研的RPC零侵入完美整合了SpringCloud Alibaba,很有成就感!](https://mp.weixin.qq.com/s/RAvSWiT3UkYCPJHvIMdFUg) * 
[《Seckill秒杀系统》开篇:我要手把手教你搭建并研发一套抗瞬时百万流量的秒杀系统](https://mp.weixin.qq.com/s/BCjuU2Dyy685iASeM9xOpg) * [硬核 | 这应该是最强的从零开始手写RPC的专栏](https://mp.weixin.qq.com/s/6tWqrijUq7uAQjjWpmzK7g) * [一周年,太狠了!(文末有福利)](https://mp.weixin.qq.com/s/NF-FEwRwN7taH3J_8Oa3VQ) * [我做了件绝大部分人和机构做不到的事情!!](https://mp.weixin.qq.com/s/_S82AThasfupCCvWogwP6g) * [原来大厂面试官也会在这里偷偷学习!](https://mp.weixin.qq.com/s/Zp0nI2RyFb_UCYpSsUt2OQ) * [转管理被下属怼,如何破局?](https://mp.weixin.qq.com/s/sCuzs3WBrY-71SH08LbUuw) ### 💥 Seckill秒杀系统(星球热更中...) * [《Seckill秒杀系统》开篇:我要手把手教你搭建并研发一套抗瞬时百万流量的秒杀系统](https://articles.zsxq.com/id_52v1wwqt8uez.html) * [《Seckill秒杀系统》第1章:从多个角度聊聊为何要学习秒杀系统](https://articles.zsxq.com/id_568x6zv029p0.html) * [《Seckill秒杀系统》第2章:秒杀系统的目标与挑战](https://articles.zsxq.com/id_o8mpr6wa7ckz.html) * [《Seckill秒杀系统》第3章:秒杀系统高并发大流量的应对之道](https://articles.zsxq.com/id_twifx1gmaqp5.html) * [《Seckill秒杀系统》第4章:秒杀系统需求与流程梳理](https://articles.zsxq.com/id_dl7wxs3uea1x.html) * [《Seckill秒杀系统》第5章:秒杀系统技术流程梳理](https://articles.zsxq.com/id_34je77wpb9ul.html) * [《Seckill秒杀系统》第6章:秒杀系统总体方案目标与架构设计](https://articles.zsxq.com/id_6beq5lgdxv16.html) * [《Seckill秒杀系统》第7章:秒杀系统基础环境搭建](https://articles.zsxq.com/id_k0a488ltsvsc.html) * [《Seckill秒杀系统》第8章:秒杀系统研发环境搭建](https://articles.zsxq.com/id_0icjzih3iia1.html) * [《Seckill秒杀系统》第9章:秒杀系统数据模型设计](https://articles.zsxq.com/id_wienuttsbsl6.html) * [《Seckill秒杀系统》第10章:基于DDD快速搭建秒杀系统项目并测试](https://articles.zsxq.com/id_p5k53dcnsmot.html) * [《Seckill秒杀系统》第11章:用户登录授权认证流程的设计与实现](https://articles.zsxq.com/id_6loz5zwuc8pd.html) * [《Seckill秒杀系统》第12章:访问登录授权限制接口的流程设计与实现](https://articles.zsxq.com/id_hnrl4fo6novx.html) * [《Seckill秒杀系统》第13章:秒杀活动业务流程与接口设计](https://articles.zsxq.com/id_wdfwtvodfnb4.html) * [《Seckill秒杀系统》第14章:秒杀活动后端业务与接口开发](https://articles.zsxq.com/id_20cgmqtahkow.html) * [《Seckill秒杀系统》第15章:秒杀活动运营端业务开发](https://articles.zsxq.com/id_rphbmq1x6ohj.html) * [《Seckill秒杀系统》第16章:秒杀活动用户端业务开发](https://articles.zsxq.com/id_b0r7cbkn1z5d.html) * [《Seckill秒杀系统》第17章:秒杀商品业务流程与接口设计](https://articles.zsxq.com/id_7xeu66680um9.html) * [《Seckill秒杀系统》第18章:秒杀商品后端业务与接口开发](https://articles.zsxq.com/id_oljjuu2nwrgd.html) * [《Seckill秒杀系统》第19章:秒杀商品运营端业务开发](https://articles.zsxq.com/id_w3ii5bkwh5gm.html) * [《Seckill秒杀系统》第20章:秒杀商品用户端业务开发](https://articles.zsxq.com/id_ocqncwn7kji9.html) * [《Seckill秒杀系统》第21章:秒杀订单业务流程与接口设计](https://articles.zsxq.com/id_mmxb7tfkj035.html) * [《Seckill秒杀系统》第22章:秒杀订单后端业务与接口开发](https://articles.zsxq.com/id_z7pvr4qj6azh.html) * [《Seckill秒杀系统》第23章:秒杀订单用户端业务开发](https://articles.zsxq.com/id_r16tqrglj0zd.html) * [《Seckill秒杀系统》第24章:秒杀订单运营端业务开发](https://articles.zsxq.com/id_1tww82zwaj30.html) * [《Seckill秒杀系统》第25章:重现刷单流量问题](https://articles.zsxq.com/id_lpob0g9wq78k.html) * [《Seckill秒杀系统》第26章:重现库存超卖问题](https://articles.zsxq.com/id_iuwm7wxw65z1.html) * [《Seckill秒杀系统》第27章:混合型缓存设计场景与原则](https://articles.zsxq.com/id_vkg89cwmiplq.html) * [《Seckill秒杀系统》第28章:混合型缓存架构设计与技术选型](https://articles.zsxq.com/id_b1k5io028b3w.html) * [《Seckill秒杀系统》第29章:混合型缓存通用代码设计与实现](https://articles.zsxq.com/id_0wkdzau58v0j.html) * [《Seckill秒杀系统》第30章:分布式锁通用代码设计与实现](https://articles.zsxq.com/id_mqv8hetgxwd5.html) * [《Seckill秒杀系统》第31章:混合型缓存通用模型设计与实现](https://articles.zsxq.com/id_x5kditct932j.html) * [《Seckill秒杀系统》第32章:活动列表混合型缓存设计与实现](https://articles.zsxq.com/id_jup74sh3lvfv.html) * [《Seckill秒杀系统》第33章:活动详情混合型缓存设计与实现](https://articles.zsxq.com/id_1rln216kx577.html) * [《Seckill秒杀系统》第34章:商品列表混合型缓存设计与实现](https://articles.zsxq.com/id_3juzlk9zuxch.html) * [《Seckill秒杀系统》第35章:商品详情混合型缓存设计与实现](https://articles.zsxq.com/id_wfbn75le9jdt.html) * 
[《Seckill秒杀系统》第36章:领域事件通用缓存模型设计](https://articles.zsxq.com/id_rj7wlsf94bjo.html) * [《Seckill秒杀系统》第37章:秒杀活动监听缓存领域事件的设计与实现](https://articles.zsxq.com/id_r91w0tv22iha.html) * [《Seckill秒杀系统》第38章:秒杀活动发送缓存领域事件的设计与实现](https://articles.zsxq.com/id_c6p9gl22z93v.html) * [《Seckill秒杀系统》第39章:秒杀商品监听缓存领域事件的设计与实现](https://articles.zsxq.com/id_wk6nrsmwxla7.html) * [《Seckill秒杀系统》第40章:秒杀商品发送缓存领域事件的设计与实现](https://articles.zsxq.com/id_05mkgmlxe4wa.html) * [《Seckill秒杀系统》第41章:秒杀订单监听领域事件的设计与实现](https://articles.zsxq.com/id_xrk938m6odxi.html) * [《Seckill秒杀系统》第42章:秒杀订单发送领域事件的设计与实现](https://articles.zsxq.com/id_lmgdiuerguce.html) * [《Seckill秒杀系统》第43章:秒杀扣减库存设计](https://articles.zsxq.com/id_ghzd1fwxws1l.html) * [《Seckill秒杀系统》第44章:基于数据库设计并实现库存防超卖](https://articles.zsxq.com/id_26bhpj32qsht.html) * [《Seckill秒杀系统》第45章:基于分布式锁设计并实现库存防超卖](https://articles.zsxq.com/id_ugbgwqmpqf06.html) * [《Seckill秒杀系统》第46章:基于Lua脚本设计并实现库存防超卖](https://articles.zsxq.com/id_488nwittqk50.html) * [《Seckill秒杀系统》第47章:从单体到微服务重构项目](https://articles.zsxq.com/id_3iqbf48v3onj.html) * [其他介绍](https://binghe.gitcode.host/md/zsxq/introduce.html) ### 🔥🔥🔥 RPC手撸专栏 #### RPC框架介绍 * [《RPC手撸专栏》介绍:我一个人开发了“Dubbo”框架](https://mp.weixin.qq.com/s/WbDutwhnzoVhuqq6tYmGHg) * [《RPC手撸专栏》前言:这次我设计了一款TPS百万级别的分布式、高性能、可扩展的RPC框架](https://mp.weixin.qq.com/s/5HOUg49X0xQmkQjMiWnaIg) #### 第一篇:整体设计 * [《RPC手撸专栏》第1章:开篇,从零开始手撸一个能在实际场景使用的高性能RPC框架](https://articles.zsxq.com/id_6gfgwev2uw0p.html) * [《RPC手撸专栏》第2章:高性能分布式RPC框架整体架构设计](https://articles.zsxq.com/id_xvd5up1u16nx.html) * [《RPC手撸专栏》第3章:RPC服务核心注解的设计与实现](https://articles.zsxq.com/id_zr6w6dvgdc95.html) * [《RPC手撸专栏》第4章:实现RPC服务核心注解的扫描与解析](https://articles.zsxq.com/id_bal2cnmw3jbi.html) #### 第二篇:服务提供者 * [《RPC手撸专栏》第5章:服务提供者收发消息基础功能实现](https://articles.zsxq.com/id_df11g9wmm8ad.html) * [《RPC手撸专栏》第6章:自定义网络传输协议的实现](https://articles.zsxq.com/id_qgntsrvlljea.html) * [《RPC手撸专栏》第7章:自定义网络编解码的实现](https://articles.zsxq.com/id_5wqs9pshkwep.html) * [《RPC手撸专栏》第8章:模拟服务消费者与服务提供者之间的数据交互](https://articles.zsxq.com/id_cd0dpx73a4uk.html) * [《RPC手撸专栏》第9章:服务提供者调用真实方法的实现](https://articles.zsxq.com/id_fwxlxhq2bthz.html) * [《RPC手撸专栏》第10章:测试服务提供者调用真实方法](https://articles.zsxq.com/id_1uamc23nb7ws.html) * [《RPC手撸专栏》第11章:服务提供者扩展支持CGLib调用真实方法](https://articles.zsxq.com/id_wjwsso804nip.html) #### 第三篇:服务消费者 * [《RPC手撸专栏》第12章:实现服务消费者与服务提供者直接通信](https://articles.zsxq.com/id_wpfhwa85newn.html) * [《RPC手撸专栏》第13章:服务消费者异步转同步直接获取返回结果](https://articles.zsxq.com/id_dfzr8j7xsn7e.html) * [《RPC手撸专栏》第14章:服务消费者异步转同步的自定义Future与AQS实现](https://articles.zsxq.com/id_6v8wcbaaitg4.html) * [《RPC手撸专栏》第15章:服务消费者同步、异步、单向调用的实现](https://articles.zsxq.com/id_oixsibgwepmu.html) * [《RPC手撸专栏》第16章:服务消费者回调方法的实现](https://articles.zsxq.com/id_7nlqbj3z8rgw.html) * [《RPC手撸专栏》第17章:服务消费者实现动态代理功能屏蔽构建请求协议对象的细节](https://articles.zsxq.com/id_bv0jkh6fhr1w.html) * [《RPC手撸专栏》第18章:服务消费者整合动态代理实现直接调用接口返回结果数据](https://articles.zsxq.com/id_r6u3xt22wf67.html) * [《RPC手撸专栏》第19章:服务消费者动态代理实现异步调用](https://articles.zsxq.com/id_ciz8m2zcl6nt.html) * [《RPC手撸专栏》第20章:服务消费者动态代理扩展优化](https://articles.zsxq.com/id_o9c21x3zm7nb.html) #### 第四篇:注册中心 * [《RPC手撸专栏》第21章:注册中心基础服务功能的实现](https://articles.zsxq.com/id_ngn4qzn5wdrw.html) * [《RPC手撸专栏》第22章:服务提供者整合注册中心实现服务注册](https://articles.zsxq.com/id_mpe8iuzbw3kv.html) * [《RPC手撸专栏》第23章:服务消费者整合注册中心实现服务发现](https://articles.zsxq.com/id_olkz5tacb7jg.html) #### 第五篇:负载均衡 * [《RPC手撸专栏》第24章:服务消费者实现基于随机算法的负载均衡策略](https://articles.zsxq.com/id_dfw9afgegew8.html) #### 第六篇:SPI扩展序列化机制 * [《RPC手撸专栏》第25章:对标Dubbo实现SPI扩展机制的基础功能](https://articles.zsxq.com/id_cvhib8cm8iaf.html) * 
[《RPC手撸专栏》第26章:基于SPI扩展JDK序列化与反序列化机制](https://articles.zsxq.com/id_4najtmbvm8gk.html) * [《RPC手撸专栏》第27章:基于SPI扩展Json序列化与反序列化机制](https://articles.zsxq.com/id_oapx2ea6lbhl.html) * [《RPC手撸专栏》第28章:基于SPI扩展Hessian2序列化与反序列化机制](https://articles.zsxq.com/id_23d9f8sx8imj.html) * [《RPC手撸专栏》第29章:基于SPI扩展FST序列化与反序列化机制](https://articles.zsxq.com/id_2m5a4wru5irw.html) * [《RPC手撸专栏》第30章:基于SPI扩展Kryo序列化与反序列化机制](https://articles.zsxq.com/id_jhrm1sk1ejdu.html) * [《RPC手撸专栏》第31章:基于SPI扩展Protostuff序列化与反序列化机制](https://articles.zsxq.com/id_oldx0om9zxqc.html) #### 第七篇:SPI扩展动态代理机制 * [《RPC手撸专栏》第32章:基于SPI扩展JDK动态代理机制](https://articles.zsxq.com/id_l74bj0wq86wd.html) * [《RPC手撸专栏》第33章:基于SPI扩展CGLib动态代理机制](https://articles.zsxq.com/id_ncdserhiza68.html) * [《RPC手撸专栏》第34章:基于SPI扩展Javassist动态代理机制](https://articles.zsxq.com/id_i2tz6xldl3hc.html) * [《RPC手撸专栏》第35章:基于SPI扩展ByteBuddy动态代理机制](https://articles.zsxq.com/id_sth5wav0oicw.html) * [《RPC手撸专栏》第36章:基于SPI扩展ASM动态代理机制](https://articles.zsxq.com/id_5he0lwawarlr.html) #### 第八篇:SPI扩展反射机制 * [《RPC手撸专栏》第37章:基于SPI扩展JDK反射机制调用真实方法](https://articles.zsxq.com/id_87wv5jfij31u.html) * [《RPC手撸专栏》第38章:基于SPI扩展CGLib反射机制调用真实方法](https://articles.zsxq.com/id_lp85axls7tlj.html) * [《RPC手撸专栏》第39章:基于SPI扩展Javassist反射机制调用真实方法](https://articles.zsxq.com/id_wgicowxzrwal.html) * [《RPC手撸专栏》第40章:基于SPI扩展ByteBuddy反射机制调用真实方法](https://articles.zsxq.com/id_0t7len2qpez1.html) * [《RPC手撸专栏》第41章:基于SPI扩展ASM反射机制调用真实方法](https://articles.zsxq.com/id_fulc1mu78zdf.html) #### 第九篇:SPI扩展负载均衡策略 * [《RPC手撸专栏》第42章:基于SPI扩展随机算法负载均衡策略](https://articles.zsxq.com/id_wngusxgpn96g.html) * [《RPC手撸专栏》第43章:基于SPI扩展加权随机算法负载均衡策略](https://articles.zsxq.com/id_ejhf3a6a8hfi.html) * [《RPC手撸专栏》第44章:基于SPI扩展轮询算法负载均衡策略](https://articles.zsxq.com/id_nxu39tr84qpm.html) * [《RPC手撸专栏》第45章:基于SPI扩展加权轮询算法负载均衡策略](https://articles.zsxq.com/id_7jfkz4bfloaw.html) * [《RPC手撸专栏》第46章:基于SPI扩展Hash算法负载均衡策略](https://articles.zsxq.com/id_g7vt9ech3vx7.html) * [《RPC手撸专栏》第47章:基于SPI扩展加权Hash算法负载均衡策略](https://articles.zsxq.com/id_jbuwvoiku5bi.html) * [《RPC手撸专栏》第48章:基于SPI扩展源IP地址Hash算法负载均衡策略](https://articles.zsxq.com/id_zaj3965d7au1.html) * [《RPC手撸专栏》第49章:基于SPI扩展源IP地址加权Hash算法负载均衡策略](https://articles.zsxq.com/id_kebef31gh2w5.html) * [《RPC手撸专栏》第50章:基于SPI扩展Zookeeper的一致性Hash算法负载均衡策略](https://articles.zsxq.com/id_9nv3pftx71zi.html) #### 第十篇:SPI扩展增强型负载均衡策略 * [《RPC手撸专栏》第51章:基于SPI扩展增强型加权随机算法负载均衡策略](https://articles.zsxq.com/id_mjigpus5d9pc.html) * [《RPC手撸专栏》第52章:基于SPI扩展增强型加权轮询算法负载均衡策略](https://articles.zsxq.com/id_fw9wlglk7vko.html) * [《RPC手撸专栏》第53章:基于SPI扩展增强型加权Hash算法负载均衡策略](https://articles.zsxq.com/id_d01mp6zox24w.html) * [《RPC手撸专栏》第54章:基于SPI扩展增强型加权源IP地址Hash算法负载均衡策略](https://articles.zsxq.com/id_9cthxw6xau9m.html) * [《RPC手撸专栏》第55章:基于SPI扩展增强型Zookeeper一致性Hash算法负载均衡策略](https://articles.zsxq.com/id_noundg5cxq55.html) * [《RPC手撸专栏》第56章:基于SPI扩展最少连接数负载均衡策略](https://articles.zsxq.com/id_9r6lownjcggi.html) #### 第十一篇:SPI扩展实现注册中心 * [《RPC手撸专栏》第57章:基于SPI扩展实现Zookeeper注册中心](https://articles.zsxq.com/id_xkq21xwlnq48.html) * [《RPC手撸专栏》第57-X章:阶段性作业](https://t.zsxq.com/09l7Ck6i3) #### 第十二篇:心跳机制 * [《RPC手撸专栏》第58章:心跳机制交互数据模型设计](https://articles.zsxq.com/id_pklupeh1lzoh.html) * [《RPC手撸专栏》第59章:心跳机制增强数据模型与协议解析设计](https://articles.zsxq.com/id_dcotswnwhxjh.html) * [《RPC手撸专栏》第60章:服务消费者向服务提供者发送心跳信息并接收心跳响应](https://articles.zsxq.com/id_sb16dguakuq4.html) * [《RPC手撸专栏》第61章:服务消费者心跳间隔时间配置化](https://articles.zsxq.com/id_wc2s7whopxwl.html) * [《RPC手撸专栏》第62章:服务提供者向服务消费者发送心跳消息并接收心跳响应](https://articles.zsxq.com/id_lztns8mdvik6.html) * [《RPC手撸专栏》第63章:服务提供者心跳间隔时间配置化](https://articles.zsxq.com/id_jmqzxfsowetb.html) * 
[《RPC手撸专栏》第63-X章:阶段性作业](https://t.zsxq.com/099LHH6JQ) #### 第十三篇:增强型心跳机制 * [《RPC手撸专栏》第64章:服务提供者增强型心跳检测机制的实现](https://articles.zsxq.com/id_9xhpnwbaa45v.html) * [《RPC手撸专栏》第65章:服务消费者增强型心跳检测机制的实现](https://articles.zsxq.com/id_ggtdg85pe3fp.html) #### 第十四篇:重试机制 * [《RPC手撸专栏》第66章:服务消费者实现服务订阅的重试机制](https://articles.zsxq.com/id_3ck9r5z89ovn.html) * [《RPC手撸专栏》第67章:服务消费者连接服务提供者的重试机制](https://articles.zsxq.com/id_i5r441hvg5fp.html) #### 第十五篇:整合Spring * [《RPC手撸专栏》第68章:服务提供者整合Spring](https://articles.zsxq.com/id_lpdm92w0kp2l.html) * [《RPC手撸专栏》第69章:基于Spring XML接入服务提供者](https://articles.zsxq.com/id_uah3g65dl7f7.html) * [《RPC手撸专栏》第70章:基于Spring注解接入服务提供者](https://articles.zsxq.com/id_punfttiew3sz.html) * [《RPC手撸专栏》第71章:服务消费者整合Spring](https://articles.zsxq.com/id_q2ag9elmdx0p.html) * [《RPC手撸专栏》第72章:基于Spring XML接入服务消费者](https://articles.zsxq.com/id_ewc703dwui4u.html) * [《RPC手撸专栏》第73章:基于Spring注解接入服务消费者](https://articles.zsxq.com/id_atww2tou7n3c.html) * [《RPC手撸专栏》第73-X章:整合Spring阶段作业](https://t.zsxq.com/09ON0butp) #### 第十六篇:整合SpringBoot * [《RPC手撸专栏》第74章:服务提供者整合SpringBoot](https://articles.zsxq.com/id_a34w101tgs4d.html) * [《RPC手撸专栏》第75章:基于SpringBoot接入服务提供者](https://articles.zsxq.com/id_8fwr1wu3jx9p.html) * [《RPC手撸专栏》第76章:服务消费者整合SpringBoot](https://articles.zsxq.com/id_6urdw6g1zg0h.html) * [《RPC手撸专栏》第77章:基于SpringBoot接入服务消费者](https://articles.zsxq.com/id_9cq199w02eim.html) * [《RPC手撸专栏》第77章-X:整合SpringBoot阶段作业](https://t.zsxq.com/09N4JAv7i) #### 第十七篇:整合Docker * [《RPC手撸专栏》第78章:基于Docker接入服务提供者](https://articles.zsxq.com/id_dfptlavwpapq.html) * [《RPC手撸专栏》第79章:基于Docker接入服务消费者](https://articles.zsxq.com/id_16ff4f19lgv2.html) * [《RPC手撸专栏》第79章-X:整合Docker阶段作业](https://t.zsxq.com/09hixMbmO) #### 第十八篇:整合SpringCloud Alibaba * [《RPC手撸专栏》第80章:整合SpringCloud Alibaba实际项目](https://articles.zsxq.com/id_zxv133s0wffh.html) * [《RPC手撸专栏》第80章-X:整合SpringCloud Alibaba阶段作业](https://t.zsxq.com/09qxlVm8k) #### 第十九篇:结果缓存 * [《RPC手撸专栏》第81章:结果缓存通用模型设计](https://articles.zsxq.com/id_a7q8wlunhtd8.html) * [《RPC手撸专栏》第82章:服务提供者支持结果缓存](https://articles.zsxq.com/id_pb8flsvw5vzn.html) * [《RPC手撸专栏》第83章:服务消费者支持结果缓存](https://articles.zsxq.com/id_7nkgz2zgzpf9.html) * [《RPC手撸专栏》第83章-X:结果缓存阶段作业](https://t.zsxq.com/0aWyqturC) #### 第二十篇:路由控制 * [《RPC手撸专栏》第84章:服务消费者直连某个服务提供者](https://articles.zsxq.com/id_loa4a96us7vr.html) * [《RPC手撸专栏》第85章:服务消费者直连多个服务提供者](https://articles.zsxq.com/id_2kfvhlwnzl76.html) * [《RPC手撸专栏》第85章-X:路由控制阶段作业](https://t.zsxq.com/0aEYXMVTS) #### 第二十一篇:延迟连接 * [《RPC手撸专栏》第86章:服务消费者支持延迟连接服务提供者](https://articles.zsxq.com/id_hj42kkrf85cn.html) * [《RPC手撸专栏》第87章:服务消费者支持非延迟连接服务提供者](https://articles.zsxq.com/id_h3svijixud6p.html) * [《RPC手撸专栏》第87章-X:延迟连接阶段作业](https://t.zsxq.com/0a1MfxLbi) #### 第二十二篇:并发控制 * [《RPC手撸专栏》第88章:并发控制基础模型设计](https://articles.zsxq.com/id_a79di5nhzlk4.html) * [《RPC手撸专栏》第89章:服务提供者支持并发控制](https://articles.zsxq.com/id_w9zdvzuxv4tu.html) * [《RPC手撸专栏》第90章:服务消费者支持并发控制](https://articles.zsxq.com/id_hu84gd1gdzah.html) * [《RPC手撸专栏》第90章-X:并发控制阶段作业](https://t.zsxq.com/0arcudg7B) #### 第二十三篇:流控分析 * [《RPC手撸专栏》第91章:流控分析后置处理器模型设计](https://articles.zsxq.com/id_ndawdn2u64gr.html) * [《RPC手撸专栏》第92章:服务提供者整合流控分析](https://articles.zsxq.com/id_b87137w95lm9.html) * [《RPC手撸专栏》第93章:服务消费者整合流控分析](https://articles.zsxq.com/id_nuww2cx76mb7.html) * [《RPC手撸专栏》第93章-X:流控分析阶段作业](https://t.zsxq.com/0an9ZVkoH) #### 第二十四篇:连接控制 * [《RPC手撸专栏》第94章:连接控制基础模型设计](https://articles.zsxq.com/id_taxtdstwbfdf.html) * [《RPC手撸专栏》第95章:服务提供者整合连接控制](https://articles.zsxq.com/id_4itcb5tpup7b.html) * [《RPC手撸专栏》第95章-X:连接控制阶段作业](https://t.zsxq.com/0asmzlzC6) #### 第二十五篇:SPI扩展连接淘汰策略 * 
[《RPC手撸专栏》第96章:基于SPI扩展最早连接淘汰策略](https://articles.zsxq.com/id_e7hlzna0gpm9.html) * [《RPC手撸专栏》第97章:基于SPI扩展最晚连接淘汰策略](https://articles.zsxq.com/id_24pp9fvewh4l.html) * [《RPC手撸专栏》第98章:基于SPI扩展先进先出连接淘汰策略](https://articles.zsxq.com/id_bbbwcru9utig.html) * [《RPC手撸专栏》第99章:基于SPI扩展使用次数最少连接淘汰策略](https://articles.zsxq.com/id_3r0wivvc2j7q.html) * [《RPC手撸专栏》第100章:基于SPI扩展最近未被使用连接淘汰策略](https://articles.zsxq.com/id_2d4k9lczhj11.html) * [《RPC手撸专栏》第101章:基于SPI扩展随机连接淘汰策略](https://articles.zsxq.com/id_6zfc5tu4sadw.html) * [《RPC手撸专栏》第102章:基于SPI扩展拒绝连接淘汰策略](https://articles.zsxq.com/id_zxujiru0m4aa.html) * [《RPC手撸专栏》第102章-X:SPI扩展连接拒绝策略阶段作业](https://t.zsxq.com/0aYfqnGxl) #### 第二十六篇:数据缓冲 * [《RPC手撸专栏》第103章:数据缓冲基础模型设计](https://articles.zsxq.com/id_7vlqjihwavcw.html) * [《RPC手撸专栏》第104章:服务提供者整合数据缓冲](https://articles.zsxq.com/id_gf5vsocpwga6.html) * [《RPC手撸专栏》第105章:服务消费者整合数据缓冲](https://articles.zsxq.com/id_b39p80gjmowd.html) * [《RPC手撸专栏》第105章-X:数据缓冲阶段作业](https://t.zsxq.com/0alPcr61Q) #### 第二十七篇:服务容错(降级) * [《RPC手撸专栏》第106章:服务容错设计与研发](https://articles.zsxq.com/id_mtz8cn9qktrr.html) * [《RPC手撸专栏》第107章:服务容错效果测试](https://articles.zsxq.com/id_356s6sxma22g.html) * [《RPC手撸专栏》第108章:服务容错失效问题修复](https://articles.zsxq.com/id_c0mka70xo8ke.html) * [《RPC手撸专栏》第108章-X:服务容错阶段作业](https://t.zsxq.com/0aChrm3ci) #### 第二十八篇:服务限流 * [《RPC手撸专栏》第109章:服务限流基础模型设计](https://articles.zsxq.com/id_vt4qwkw275gr.html) * [《RPC手撸专栏》第110章:服务提供者整合服务限流](https://articles.zsxq.com/id_d6o05rs0bwmw.html) * [《RPC手撸专栏》第111章:服务消费者整合服务限流](https://articles.zsxq.com/id_32dwwi3an8rx.html) * [《RPC手撸专栏》第111章-X:服务限流阶段作业](https://t.zsxq.com/0aTS2J22h) #### 第二十九篇:基于SPI扩展限流策略 * [《RPC手撸专栏》第112章:基于SPI扩展Semaphore限流策略](https://articles.zsxq.com/id_zchmw3ulitfs.html) * [《RPC手撸专栏》第113章:基于SPI扩展Guava限流策略](https://articles.zsxq.com/id_ecztx69ekhfz.html) * [《RPC手撸专栏》第113章-X:基于SPI扩展限流策略阶段作业](https://t.zsxq.com/0a2Fzp3Sw) #### 第三十篇:超出限流规则 * [《RPC手撸专栏》第114章:服务提供者超出限流上限触发的规则](https://articles.zsxq.com/id_2rqhz490onjc.html) * [《RPC手撸专栏》第115章:服务消费者超出限流上限触发的规则](https://articles.zsxq.com/id_rwoaeagi50wl.html) * [《RPC手撸专栏》第115章-X:超出限流规则阶段作业](https://t.zsxq.com/0aB1cpEW3) #### 第三十一篇:服务熔断 * [《RPC手撸专栏》第116章:服务熔断基础模型设计](https://articles.zsxq.com/id_8rrhnee0nv8q.html) * [《RPC手撸专栏》第117章:服务提供者整合服务熔断](https://articles.zsxq.com/id_0pbfjv4ghrgw.html) * [《RPC手撸专栏》第118章:服务消费者整合服务熔断](https://articles.zsxq.com/id_ohtoim0u1pkz.html) * [《RPC手撸专栏》第118章-X:服务熔断阶段作业](https://t.zsxq.com/0bAeufSFe) #### 第三十二篇:基于SPI扩展熔断策略 * [《RPC手撸专栏》第119章:基于SPI扩展错误率熔断策略](https://articles.zsxq.com/id_ozjp7ion2ebj.html) * [《RPC手撸专栏》第119章-X:基于SPI扩展熔断策略阶段作业](https://t.zsxq.com/0bVwOW9GS) #### 第三十三篇:异常监控 * [《RPC手撸专栏》第120章:异常监控后置处理器基础模型设计](https://articles.zsxq.com/id_bwceq9e1sufr.html) * [《RPC手撸专栏》第121章:服务提供者整合异常监控](https://articles.zsxq.com/id_ds7crh46v1uv.html) * [《RPC手撸专栏》第122章:服务消费者整合异常监控](https://articles.zsxq.com/id_opgjgzu8avrg.html) * [《RPC手撸专栏》第122章-X:异常监控阶段作业](https://t.zsxq.com/0b4npCLi6) #### 维护篇:持续维护篇 * [《RPC手撸专栏》第fix-01章:修复服务消费者读取配置优先级的问题](https://articles.zsxq.com/id_iudrc7plepqa.html) * [《RPC手撸专栏》第fix-02章:修复Zookeeper一致性Hash负载均衡泛型类型不匹配的问题](https://articles.zsxq.com/id_l6v50rh41eah.html) * [《RPC手撸专栏》第fix-03章:修复自定义扫描器递归扫描文件标识不起作用的问题](https://articles.zsxq.com/id_jgao5h7atdlp.html) * [《RPC手撸专栏》第fix-04章:修复基于SpringBoot启动服务消费者Netty Group多次连接的问题](https://articles.zsxq.com/id_813tnxx7ir4i.html) * [《RPC手撸专栏》第fix-05章:修复基于计数器的限流策略不起作用的问题](https://articles.zsxq.com/id_kfwu35ttk997.html) * [《RPC手撸专栏》第fix-06章:修复基于SpringBoot启动服务消费者无法同时连接多个服务提供者的问题](https://articles.zsxq.com/id_p6kt5aspbb07.html) * 
[《RPC手撸专栏》第fix-07章:更新基于Semaphore的限流策略](https://articles.zsxq.com/id_kdi8ot238q16.html) * [《RPC手撸专栏》第fix-08章:优化服务熔断半开启状态的执行逻辑](https://articles.zsxq.com/id_r7rsxfojwkua.html) #### 番外篇 * [《从零开始手写RPC框架》电子书升级啦!](https://mp.weixin.qq.com/s/Z13jGNY94rEewp1HW9yiPQ) * [用过来人的身份告诉你为何大厂都要自研RPC?](https://mp.weixin.qq.com/s/7DkT5hWw8XHqqWV3JkX7pg) ### 🔥🔥🔥 Spring核心技术 * [《Spring IOC核心技术》共27章,19万字,打包发布!!](https://mp.weixin.qq.com/s/5wCnprKkMRx0Pd1gDImKtg) #### 专栏介绍 * [《Spring核心技术》开篇:我要带你一步步调试Spring6.0源码啦!](https://articles.zsxq.com/id_dxl5ojx1vt2j.html) #### 第一篇:IOC容器 * [《Spring核心技术》第1章:深度解析@Configuration注解(注解、案例与时序图部分)](https://articles.zsxq.com/id_jgn1d90d9nqo.html) * [《Spring核心技术》第1章:深度解析@Configuration注解(源码解析部分)](https://articles.zsxq.com/id_88k0ww2tsr6n.html) * [《Spring核心技术》第2章:深度解析@ComponentScan注解(注解、案例与时序图部分)](https://articles.zsxq.com/id_fvkiczjwem3u.html) * [《Spring核心技术》第2章:深度解析@ComponentScan注解(源码解析部分)](https://articles.zsxq.com/id_w84u4e7fraen.html) * [《Spring核心技术》第3章:深度解析@Bean注解(注解、案例与时序图部分)](https://articles.zsxq.com/id_h6ssgch50ohq.html) * [《Spring核心技术》第3章:深度解析@Bean注解(源码解析部分)](https://articles.zsxq.com/id_vf8ki780f0on.html) * [《Spring核心技术》第3章:深度解析@Bean注解(完善篇)](https://articles.zsxq.com/id_olf2fecx1h6x.html) * [《Spring核心技术》第4章:深度解析从IOC容器中获取Bean的过程](https://articles.zsxq.com/id_ac2wkclvokp3.html) * [《Spring核心技术》第5章:深度解析@Import注解(注解、案例与时序图部分)](https://articles.zsxq.com/id_m85awf52ap7s.html) * [《Spring核心技术》第5章:深度解析@Import注解(源码解析部分)](https://articles.zsxq.com/id_2wqrrl87q4ww.html) * [《Spring核心技术》第6章:深度解析@PropertySource注解](https://articles.zsxq.com/id_b7de867wms62.html) * [《Spring核心技术》第7章:深度解析@DependsOn注解](https://articles.zsxq.com/id_erb3xg2db8s4.html) * [《Spring核心技术》第8章:深度解析@Conditional注解](https://articles.zsxq.com/id_j1615j8bswpu.html) * [《Spring核心技术》第9章:深度解析@Lazy注解](https://articles.zsxq.com/id_umk0hfrr4qf2.html) * [《Spring核心技术》第10章:深度解析@Component注解(含扩展注解)](https://articles.zsxq.com/id_2svfoc2onw0p.html) * [《Spring核心技术》第11章:深度解析@Value注解(含@Autowired和@Inject)](https://articles.zsxq.com/id_1qtfp128b7wc.html) * [《Spring核心技术》第12章:深度解析@Autowired注解](https://articles.zsxq.com/id_xodqwm77k51b.html) * [《Spring核心技术》第13章:深度解析@Qualifier注解](https://articles.zsxq.com/id_tro8zjto40zn.html) * [《Spring核心技术》第14章:深度解析@Resource注解](https://articles.zsxq.com/id_v9wd5237pgu6.html) * [《Spring核心技术》第15章:深度解析@Inject注解](https://articles.zsxq.com/id_2lbs516korwe.html) * [《Spring核心技术》第16章:深度解析@Primary注解](https://articles.zsxq.com/id_ag4v69xzpla1.html) * [《Spring核心技术》第17章:深度解析@Scope注解](https://articles.zsxq.com/id_gbhw3a0m659q.html) * [《Spring核心技术》第18章:深度解析@PostConstruct注解与@PreDestroy注解](https://articles.zsxq.com/id_q1dtx07qi6tx.html) * [《Spring核心技术》第19章:深度解析@Profile注解](https://articles.zsxq.com/id_eproza82fs1w.html) * [《Spring核心技术》第20章:深度解析循环依赖](https://articles.zsxq.com/id_jjiexxsn8j3f.html) * [《Spring核心技术》第21章:深度解析事件监听机制](https://articles.zsxq.com/id_94mft95hpqe5.html) #### 第二篇:AOP切面 * [《Spring核心技术》第22章:AOP切面型注解实战](https://articles.zsxq.com/id_xhaep3mgxbel.html) * [《Spring核心技术》第23章:深度解析@EnableAspectJAutoProxy注解](https://articles.zsxq.com/id_khq22od1cfhz.html) * [《Spring核心技术》第24章:深度解析切入点表达式](https://articles.zsxq.com/id_c8z1ro4n7iqu.html) * [《Spring核心技术》第25章:深度解析构建AOP拦截器链的流程](https://articles.zsxq.com/id_1le7i7cflbec.html) * [《Spring核心技术》第26章:深度解析调用通知方法的流程](https://articles.zsxq.com/id_twxhnxik8enc.html) * [《Spring核心技术》第27章:深度解析@DeclareParents注解](https://articles.zsxq.com/id_rw597583o4jg.html) * [《Spring核心技术》第28章:@EnableLoadTimeWeaving注解](https://articles.zsxq.com/id_hsvnjsdjoci8.html) #### 第三篇:声明式事务 * 
[《Spring核心技术》第29章:Spring事务概述与编程实战](https://articles.zsxq.com/id_kq76kv1nwzoe.html) * [《Spring核心技术》第30章:深度解析Spring事务三大接口](https://articles.zsxq.com/id_ho2wnitvu0w1.html) * [《Spring核心技术》第31章:深度解析Spring事务隔离级别与传播机制](https://articles.zsxq.com/id_27ujpfwf1mh3.html) * [《Spring核心技术》第32章:深度解析@EnableTransactionManagement注解](https://articles.zsxq.com/id_8oxd67xgmkdk.html) * [《Spring核心技术》第33章:深度解析@Transactional注解](https://articles.zsxq.com/id_ezlz9t1pjks0.html) * [《Spring核心技术》第34章:深度解析Spring事务的执行流程](https://articles.zsxq.com/id_8q0nz7rucvjd.html) * [《Spring核心技术》第35章:深度解析Spring底层事务传播机制源码](https://articles.zsxq.com/id_6sw4tcdnl0qf.html) * [《Spring核心技术》第36章:深度解析@TransactionEventListener注解](https://articles.zsxq.com/id_6mqzv4xx6n58.html) * [《Spring核心技术》第37章:七大场景深度分析Spring事务嵌套最佳实践](https://articles.zsxq.com/id_je5i1cblcqkn.html) * [《Spring核心技术》第38章:深度解析Spring事务失效的八大场景](https://articles.zsxq.com/id_z55u4dijij8c.html) #### 第四篇:AOT预编译 * [《Spring核心技术》第39章:AOT预编译技术概述](https://articles.zsxq.com/id_p5c7maejbow8.html) * [《Spring核心技术》第40章:构建Native Image](https://articles.zsxq.com/id_rh08wot03ls8.html) * [《Spring核心技术》第41章:Maven构建Native Image](https://articles.zsxq.com/id_ssa05mm62s0r.html) #### 第五篇:SpringMVC * [《Spring核心技术》第42章:注解型SpringMVC通用SpringBoot启动模型设计与实现](https://articles.zsxq.com/id_rt2w9t6inv5j.html) * [《Spring核心技术》第43章:深度解析@Controller注解](https://articles.zsxq.com/id_2tkel05tilvq.html) * [《Spring核心技术》第44章:深度解析@RestController注解](https://articles.zsxq.com/id_wrrk6j19tov0.html) * [《Spring核心技术》第45章:深度解析@RequestMapping注解](https://articles.zsxq.com/id_lr7351z398mb.html) * [《Spring核心技术》第46章:深度解析@RequestParam注解](https://articles.zsxq.com/id_uvbwbxp5pwgw.html) * [《Spring核心技术》第47章:深度解析@PathVariable注解](https://articles.zsxq.com/id_27gccis9me0v.html) * [《Spring核心技术》第48章:深度解析@RequestBody注解](https://articles.zsxq.com/id_wzt9ndj1t2o2.html) * [《Spring核心技术》第49章:深度解析@RequestHeader注解](https://articles.zsxq.com/id_vd0nabjnlo9x.html) * [《Spring核心技术》第50章:深度解析@CookieValue注解](https://articles.zsxq.com/id_aphnzhoz3wd3.html) * [《Spring核心技术》第51章:深度解析@ModelAttribute注解](https://articles.zsxq.com/id_6zpnxo0j94za.html) * [《Spring核心技术》第52章:深度解析@ExceptionHandler注解](https://articles.zsxq.com/id_0zdvz0bk7zgn.html) * [《Spring核心技术》第53章:深度解析@InitBinder注解](https://articles.zsxq.com/id_qcwj4wrgz6vo.html) * [《Spring核心技术》第54章:深度解析@ControllerAdvice注解](https://articles.zsxq.com/id_l1q55mu4o9ou.html) * [《Spring核心技术》第55章:深度解析@RequestAttribute注解](https://articles.zsxq.com/id_8rw7tq1wuzom.html) * [《Spring核心技术》第56章:深度解析@SessionAttribute注解](https://articles.zsxq.com/id_m3ssh7jnnpcg.html) * [《Spring核心技术》第57章:深度解析@SessionAttributes注解](https://articles.zsxq.com/id_wnv2f8smr4c6.html) * [《Spring核心技术》第58章:深度解析@ResponseBody注解](https://articles.zsxq.com/id_x5e4eh58lfrv.html) * [《Spring核心技术》第59章:深度解析@CrossOrigin注解](https://articles.zsxq.com/id_iehog588vfef.html) #### 作业篇 * [《Spring核心技术》作业:专栏整体作业](https://t.zsxq.com/0cK9NLELB) ### 🔥🔥🔥 精通高并发系列 * [为何要学习Seckill分布式秒杀系统?它能为你带来什么?](https://mp.weixin.qq.com/s/H4l26h3qRV-05XiNZtQ7OA) * [秒杀系统瞬时百万并发流量的六种应对之道](https://mp.weixin.qq.com/s/Mj1DRFOmzThRV7CW6jWugg) * [🔥🔥🔥 《深入理解高并发编程(第2版)》八大篇章,共433页,打包发布!!](https://mp.weixin.qq.com/s/E5XZW_HSqKBbgxix6xQGAQ) * [深入理解CPU缓存一致性协议MESI(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499147&idx=1&sn=757b83308cef7248a0d4549dd1ffa9a7&chksm=cf564a8af821c39c28c911e08b9367979a4cd868c2be64f93ef3571936b614416716f3ad6d9b&token=2138570566&lang=zh_CN#rd) * 
[面试高并发,凉了!!(全程高能,建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499600&idx=1&sn=cdb4b375676fc71357cbca6a30d9b93d&chksm=cf564851f821c147617db702fe4af386fe1e2acea1caa318ba33a3dd3878f538ae4ce74d3ea4&token=2138570566&lang=zh_CN#rd) * [放假了再撸一篇:StampedLock(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499371&idx=1&sn=4651483f83675642657c0bfc71cfdc5c&chksm=cf56496af821c07cbc5acf16ee24e36da72f73554916a5b01427e0b2d36e728e19e885b3a94c&token=2138570566&lang=zh_CN#rd) * [一个Callable接口能有多少知识点?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499203&idx=1&sn=ec3fed322fbb7497563921d1a31924b8&chksm=cf564ac2f821c3d467b6efa428c2b67885e466e13b67fd8c363cbb3ab0ae0edbba86c18b269e&token=2138570566&lang=zh_CN#rd) * [10分钟手撸Java线程池,yyds!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497033&idx=1&sn=2d4dfe36a6799e8484915d0faf86c7d5&chksm=cf564248f821cb5e88fd823c1203a534fe3f9dff4b81cd29cebf5c43fc0619a765716bc7a093&token=1170633251&lang=zh_CN#rd) * [不得不说的几种限流算法](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496873&idx=2&sn=be0a6919734f01d042c63a12cf692d03&chksm=cf5643a8f821cabeae2c37ab468552b9a104f3e0ed5f81ffb862faa1c6da982754eae2c0ac6d&token=1170633251&lang=zh_CN#rd) * [一文扫尽Java中的并发原子类,yyds!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496611&idx=1&sn=12885d386beeda9ebaa7192eeb74de07&chksm=cf5644a2f821cdb4b458a43e26c8f3af2c1f3d274d82ecc53ca9e0873f36cd12022a3c179a42&token=627091476&lang=zh_CN#rd) * [高并发秒杀系统如何实现正确的扣减库存?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495768&idx=1&sn=7a9eadbfc54600b26fa1bcb00bca5b52&chksm=cf564759f821ce4fb234ced2deb2c2bee1c4ed3b78115545ae481fd5314d6eb1d865df9c6cc8&token=546222828&lang=zh_CN#rd) * [生产环境内存溢出了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495757&idx=1&sn=66ad5e42ee5bfb4e60ae257c1296f261&chksm=cf56474cf821ce5ada36c785e7086d341add1e3333330268c7e3ae8cc1dfdc5758d3c86930f9&token=546222828&lang=zh_CN#rd) * [实践出真知:全网最强秒杀系统架构解密!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495561&idx=1&sn=84324d48463ebcb34aa3b095a6f3af87&chksm=cf565888f821d19eb98b83793f476c800287b7df2bae4f1f68e6af32c7adff38ad31de1df282&token=546222828&lang=zh_CN#rd) * [并发编程踩坑实录二:并发容器踩坑总结!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495501&idx=1&sn=1140ff616ac5d247f9a3c100c3a82322&chksm=cf56584cf821d15a78f6d5a05d3085991b8fdd2a032c89944ffb4d7e7d06ad3d6ea0f09b3ffd&token=546222828&lang=zh_CN#rd) * [并发编程踩坑实录一:盘点JDK中同步容器的两大坑!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495411&idx=1&sn=5826041180fbb0a2fb938b1cdb4fcc43&chksm=cf5659f2f821d0e4eed130c28aa757058699a9e2f92a6bbc67fc5688de362edbc50bdd396d13&token=1081687632&lang=zh_CN#rd) * [系统性能优化的关键指标(yyds)!!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495332&idx=1&sn=6fcaf156fae6419a03ede479a30eede4&chksm=cf5659a5f821d0b37d8067bf868fb60780e5794e4220312e55c5f5e9bd3994c72b724f2f379d&token=1081687632&lang=zh_CN#rd) * [我用10张图总结出这份并发编程最佳学习路线!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495270&idx=1&sn=9c5a94cbdfd8115b45086d8bc3f06668&chksm=cf565967f821d071a83e68c22c391da8aca09ae5621af2933311d3986ea179b7642a955ba0d5&token=1081687632&lang=zh_CN#rd) * [进一步优化:性能提升了200%!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495055&idx=1&sn=8c1f14322c2f9327cea3dfafbfde855b&chksm=cf565a8ef821d398fc22ee97cbe8e29002bda11359fa122fe6538d2a5478d1bd203d8123fcc6&token=1081687632&lang=zh_CN#rd) * 
[性能提升了200%!(优化篇)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494832&idx=1&sn=a2ce25f76cfaaf3c26354af226fd26c3&chksm=cf565bb1f821d2a72dad4b9bf49a6f1b449794bdefde0fc9de5b78a4df6b9d743e347fe9993e&token=1081687632&lang=zh_CN#rd) * [StampedLock,一种比读写锁更快的锁!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494645&idx=1&sn=e56f8780f857415464618b5b8546d28b&chksm=cf565cf4f821d5e21e1d91c0096680e2a6a437d51454298fd1c7a6c0d650774e8c89fda544cd&token=1772864567&lang=zh_CN#rd) * [除了技术,互联网大厂更看重这些基础能力!!(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493259&idx=1&sn=2e17273ac15d8ca1f341303f46e4d438&chksm=cf56518af821d89c0b0ba592d405a23191eea804328d44250872a50e992e3c2d390a0e36b779&token=447597716&lang=zh_CN#rd) * [【漫画高并发】工作了3年的程序员小菜面试高并发岗位被吊打虐哭!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493121&idx=1&sn=cfc5676acd07a039c3716859479d9f46&chksm=cf565100f821d8168c2191fe6ab00590e657c96ad16727b708586c759dc568cae35065b2bd3b&token=1352513703&lang=zh_CN#rd) * [刚研究完Callable和Future,各位随便问!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492717&idx=1&sn=14f9818c02884143e5a249a8eb6f541a&chksm=cf56536cf821da7a07e305e16938b0d754af05485498e68ba2b9c8adc711189eb56f3af50740&token=1574716246&lang=zh_CN#rd) * [这部电子书凭什么短短几个月全网累计下载突破16万?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492194&idx=1&sn=536d8ac9c4ab6f5bb45a156ccb671f90&chksm=cf565563f821dc75a6ed888a5bec1ab1c3371e67fe8154d7f03903bf14b3767d77337fcab505&token=1093611389&lang=zh_CN#rd) * [更正《深入理解高并发编程(第1版)》中的一处错误!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491931&idx=1&sn=63c272ddb2e6fefe1beeec0a9a748722&chksm=cf56565af821df4ce5c03fe29daf83dd3dfb55a0f20355c2ec90a922d380b965a503dfb59b3f&token=1093611389&lang=zh_CN#rd) * [冰河写的这部《深入理解高并发编程》电子书全网已累计下载15W+!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491329&idx=1&sn=fa306c598453149daf646655236f01ed&chksm=cf55a800f8222116c05cb7f8b2036842aa8951e65237a5b1e86d597e5ab3273613e29ee45b10&token=776659970&lang=zh_CN#rd) * [高并发场景下如何优化服务器的性能?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490255&idx=1&sn=714509ac9747898d34489045e382387c&chksm=cf55adcef82224d8f2570dd02bd1bb1c3c23d1c564a820f09eb5158c6632d5faaf4bf716a741&token=776659970&lang=zh_CN#rd) * [ReadWriteLock怎么和缓存扯上关系了?!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490372&idx=1&sn=96f0ed94f9b0c8c19eb5586f0ff20e30&chksm=cf55ac45f82225538854df50ea3817cd5b07d90d59db9bd5259b1502bce4e795c0cdafc19afb&token=776659970&lang=zh_CN#rd) * [肝了三天三夜整理出这份36万字全网最牛的开源高并发编程PDF!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489139&idx=1&sn=17096da9642b01ca8311e07a119ecec0&chksm=cf55a172f82228642dee1571a8bd1e6b82ab5abeb8329b5e7f6f834fe2ee9da4ecff84ca28fa&token=776659970&lang=zh_CN#rd) * [冰河的并发编程核心知识(源码分析篇)电子书开源啦(文末免费领取)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489158&idx=1&sn=8827ff6e97807154446565882ce7e487&chksm=cf55a187f82228917dbce2cfbce900bc5140dc0f321a2a6d504a40f6def1b2801d5aa6481581&token=776659970&lang=zh_CN#rd) * [线程与多线程](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489407&idx=1&sn=fbbd0e6970064aecd0dcd603eb8489e3&chksm=cf55a07ef82229686ec55d5bbe4dabc658524e073a27fff2b1d86f0d8f56cf002b2cc0f548f5&token=776659970&lang=zh_CN#rd) * [线程的执行顺序](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489406&idx=1&sn=476f4eca8e42190d837077c97e659ad3&chksm=cf55a07ff82229692bc063cff0a57036e607c496a5dd163f70e57e1348a3a9fda6c30a01e9aa&token=776659970&lang=zh_CN#rd) * 
[深入解析Callable接口](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489405&idx=1&sn=a71ece73ef42b4c1cdd4858627a96879&chksm=cf55a07cf822296aa2036a7144f62120981d4ff9a78ba1af1b35cb794fd4326284a3f9f9a900&token=776659970&lang=zh_CN#rd) * [两种异步模型与深度解析Future接口](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489404&idx=1&sn=9070d83bb4180c83b92597c066d68c12&chksm=cf55a07df822296b328b494659d33d34c3765de0fa441372ea4b9dad3541343852d006b9d07f&token=776659970&lang=zh_CN#rd) * [不得不说的线程池与ThreadPoolExecutor类浅析](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489402&idx=1&sn=2e7ed9ee117399a28afeda9877a8db5f&chksm=cf55a07bf822296d6699f0a5bf6da5b7050f8398aa434acd4af604b309230ea631f43ab7c5de&token=776659970&lang=zh_CN#rd) * [P8级别架构师带你深度解析线程池中那些重要的顶层接口和抽象类](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489401&idx=1&sn=e54d6898cc194c4f713a3318b2172e6e&chksm=cf55a078f822296e6b66c23154339b449b05b22f68b62997ea9513d5e6131e73d22219a0e79b&token=776659970&lang=zh_CN#rd) * [创建线程池居然有这么多方式...](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489397&idx=1&sn=57d3e04af3e52271e65b4247d72d65c0&chksm=cf55a074f8222962991e210912148f396a0bc01bb98ed39ac02085d289831bff9167cf084bf5&token=776659970&lang=zh_CN#rd) * [ThreadPoolExecutor类居然是这样保证线程池正确运行的...](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489398&idx=1&sn=307c7b3d78aa275a682263bd218f9a1d&chksm=cf55a077f822296112d619dbc1641725d59c7b826ae59cb3216aa2bcf98fd15e100ded7d2fce&token=776659970&lang=zh_CN#rd) * [通过ThreadPoolExecutor类的源码深度解析线程池执行任务的核心流程](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489396&idx=1&sn=50a2e8ffb5f303a9aff01b85f6f098a5&chksm=cf55a075f8222963c2780ab99726c96f185cb7973cc8a9e7189413010bc8a3f9713e4de8c656&token=776659970&lang=zh_CN#rd) * [通过源码深度分析线程池中Worker线程的执行流程](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489395&idx=1&sn=c18513f458ee9d09126c4f55e3e4151b&chksm=cf55a072f82229645b553a6135845048f221f56ef5cf84467bb1c66b4e0c0c80bd974728d7a9&token=776659970&lang=zh_CN#rd) * [从源码角度深度解析线程池是如何实现优雅退出的](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489394&idx=1&sn=316cd39a28b9ad465807963a15f38bd4&chksm=cf55a073f8222965c582a9b5b853176603cdba49ac56d7a80379a92d7e03cde3620d290324cb&token=776659970&lang=zh_CN#rd) * [ScheduledThreadPoolExecutor与Timer的区别和简单示例](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489393&idx=1&sn=9a7eac3f87baacf5befa311d480d91c7&chksm=cf55a070f822296601d4651061db538681d03ff87d5b132adb73ebb0519f03dec527d0a65ebc&token=776659970&lang=zh_CN#rd) * [深度解析ScheduledThreadPoolExecutor类的源代码](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489392&idx=1&sn=dcdcc00845dccdefb9241bcac87a3b44&chksm=cf55a071f8222967140c4c8d21893223d7d018b3edda4c0a213df19846d8b6c1a1c101fe37db&token=776659970&lang=zh_CN#rd) * [深度解析ScheduledFutureTask类源码](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489391&idx=1&sn=a7f71046ba9552c27a0f29bace2142cb&chksm=cf55a06ef822297853c1c69aaec03c31388206020f1ead99fd306cfd5622f1b3cc13d06bf081&token=776659970&lang=zh_CN#rd) * [多线程的三大特性(原子性、可见性和有序性)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489390&idx=1&sn=94ed4db3fccee1c74ad8d19c8f846334&chksm=cf55a06ff822297986d39fd4417b92086f5924e5a6cae55ae90ce2a110c43110130e6579c192&token=776659970&lang=zh_CN#rd) * 
[浅谈AQS中的CountDownLatch、Semaphore与CyclicBarrier](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489374&idx=1&sn=b75f2b908a6e8153b6f1627fccd9231a&chksm=cf55a05ff8222949658cf04198ea93fd5a123e1c1301b48a94451a84379efcf1e2e1e8bf76c2&token=776659970&lang=zh_CN#rd) * [浅谈AQS中的Lock与Condition](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489373&idx=1&sn=5bce8306b75780f2dd3d6ea285b1ee91&chksm=cf55a05cf822294a55205ee04c2636485abe6789a65d0e3156e3b908de8cf93597721af0fecb&token=776659970&lang=zh_CN#rd) * [一个工作了7年的朋友去面试竟然被Thread类虐的体无完肤](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489372&idx=1&sn=d7987d839975c66fcb061285df5d4d38&chksm=cf55a05df822294b8e034b19bfe003836bff4a4cab14e66336bb618919bcb1b7c1eb2d0f3933&token=776659970&lang=zh_CN#rd) * [如何使用Java7中提供的Fork/Join框架实现高并发程序?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489370&idx=1&sn=35533ba6aaad0c727f936f0d3751a8ee&chksm=cf55a05bf822294d003bd8e7b2a992b6dabfbb268f965a3b312399e5e5ae80dec59f61c76675&token=776659970&lang=zh_CN#rd) * [要想学好并发编程,关键是要理解这三个核心问题](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489368&idx=1&sn=e856feb5e9c582e865fddbc91b91a636&chksm=cf55a059f822294fb30c4d1b016c7861a16668706cbae4e14acdaec6a093dfc647b4df6dd7d4&token=776659970&lang=zh_CN#rd) * [工作了3年的程序员小菜面试高并发岗位被吊打虐哭。。。](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489367&idx=1&sn=f23516aa06019130969b619e5fd84d83&chksm=cf55a056f82229401944d8de0583d5535f824597535fab1c04e427f16493480b00985adc7cb0&token=776659970&lang=zh_CN#rd) * [导致并发编程频繁出问题的“幕后黑手”](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489366&idx=1&sn=dc827567e6f8ac2beddc91a892fa8e12&chksm=cf55a057f8222941cc026506828b90d781fd6733e7db68a8a8417d40a8de2e0cf96a7ac8d11f&token=776659970&lang=zh_CN#rd) * [解密导致诡异并发问题的第一个幕后黑手——可见性问题](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489365&idx=1&sn=f6b7b9f70b8653e7156fdf7c34c4a74d&chksm=cf55a054f822294277d9d917b3020a1b218cb6bc81fd8b5521867c7bbb1a8bba19a2af9b3c10&token=776659970&lang=zh_CN#rd) * [解密导致并发问题的第二个幕后黑手——原子性问题](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489364&idx=1&sn=fdeca5407f03b9283bf7699b7882517b&chksm=cf55a055f8222943bf0160f955a10e47693a1015a6c823184fa38f2b7acf37c2d743a2664f91&token=776659970&lang=zh_CN#rd) * [解密导致并发问题的第三个幕后黑手——有序性问题](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489363&idx=1&sn=247470d979a1180503a082e53259c5f5&chksm=cf55a052f82229442551f6c9076b8b2c977cc82384524e455ccc7fb1bfb0f732ba11b1557773&token=776659970&lang=zh_CN#rd) * [何为Happens-Before原则?这次彻底懂了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489362&idx=1&sn=fa49532eee6faf48eeb7544578748acb&chksm=cf55a053f82229458d23f52e0ccf4ac539ea8f77d61126817357dfed4f990031f3fe595f8b52&token=776659970&lang=zh_CN#rd) * [如何解决可见性和有序性问题?这次彻底懂了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489361&idx=1&sn=9c58d50a127c73eda48770de2947723d&chksm=cf55a050f82229460e8d4eb994974af7b8fc20a8881fb10bfdace3d9f8d0aecc5f75921d545b&token=776659970&lang=zh_CN#rd) * [synchronized原理是什么?这次彻底懂了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489360&idx=1&sn=15a678c02e771fd6c48d16a90fe98a7a&chksm=cf55a051f82229472747c67f4b11a74282f414f026b6b1cbde72caae2e30882026e99fb8ae96&token=776659970&lang=zh_CN#rd) * [一分钟了解什么是限流](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489359&idx=1&sn=c233458a8e7c612748bf520a7af457d0&chksm=cf55a04ef8222958fe57ae5c989edbfb1c3cfef36cb0c2c4b519c65414f3f0cb51e80ed266a9&token=776659970&lang=zh_CN#rd) * 
[32位多核CPU并发读写long型数据为何会出现诡异问题?看完这篇我懂了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489358&idx=1&sn=de3775195aa7503d56ab3cad8655516a&chksm=cf55a04ff822295974e9479b5c75fdc5e93dea7bcc11aa67ef92c0e53bef608c1940dd17a30a&token=776659970&lang=zh_CN#rd) * [如何使用互斥锁解决多线程的原子性问题?这次终于明白了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489357&idx=1&sn=39aa066bc585e7afd4a6c05f660fc70d&chksm=cf55a04cf822295add93717a88bf29b0905d24a80de21c6bc3a2b92425a67909ba493c20b254&token=776659970&lang=zh_CN#rd) * [不可不说的几种限流算法](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489356&idx=1&sn=63c472455df876c7c23a492a864d2549&chksm=cf55a04df822295b7da0f2ffc38853623254a0e28a691978b8e2a9d3b1dfef7430bab00756a3&token=776659970&lang=zh_CN#rd) * [ThreadPoolExecutor类的核心流程](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489355&idx=1&sn=e4b48fd6d3bef8a3e5ae96d97bbefcb2&chksm=cf55a04af822295c0ba45518ef8df7e5e5a097658f085ab53445648f0b2d60f23084e8488d81&token=776659970&lang=zh_CN#rd) * [ThreadLocal学会了这些,你也能和面试官扯皮了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489354&idx=1&sn=17f589a29e2b6124cb9e223e0ea88e6c&chksm=cf55a04bf822295d010cc3c1744df1327db442ddcdcac8af21ba4b21dc6fbac9081a0c75d95c&token=776659970&lang=zh_CN#rd) * [高并发秒杀系统架构解密,不是所有的秒杀都是秒杀!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489353&idx=1&sn=768ebcd3612a0c04465367751473a737&chksm=cf55a048f822295e892b7f9a55284e82d80db52b18afc53a1a3b1868d452e4286b156b9aaefb&token=776659970&lang=zh_CN#rd) * [高并发环境下诡异的加锁问题(你加的锁未必安全)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489352&idx=1&sn=98884d07d5d237c3bc7ff183c08c93cb&chksm=cf55a049f822295fcf9352de03fd910557fd4a800ced8caae65e64836233689d891d682397f6&token=776659970&lang=zh_CN#rd) * [什么是ForkJoin?看这一篇就够了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489351&idx=1&sn=0d601cd93684fb6ae6c391df64086a63&chksm=cf55a046f822295004e1185f2772105938df02c38c8a54f167b419cdacc164d126b107c87dd9&token=776659970&lang=zh_CN#rd) * [优化加锁方式时竟然死锁了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489350&idx=1&sn=a7b0a8f5a3a02d9b164090c782129a88&chksm=cf55a047f8222951d867043364d3f96a1a3ebc2a15d8b357ad84baef3f189f0ca0c7a674fbcd&token=776659970&lang=zh_CN#rd) * [你知道吗?大家都在使用Redisson实现分布式锁了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489349&idx=1&sn=9e8afb96f1d063dcc7354e505f773aa5&chksm=cf55a044f82229523d85cf4ebd0e6ca93c36ac74290d5d5ef59edc1d7b65040274a780f53d42&token=776659970&lang=zh_CN#rd) * [为何高并发系统中都要使用消息队列?这次彻底懂了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489348&idx=1&sn=f6816c0afd29da001d9491eeebba5059&chksm=cf55a045f82229539264f0aed0d18a478152558accd6b35131e85247eca50c06167fb6becee0&token=776659970&lang=zh_CN#rd) * [在高并发环境下该如何构建应用级缓存?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489347&idx=1&sn=0c0129168d9fffaee4578111ce037e11&chksm=cf55a042f8222954843b52772318ab2cc1a5d32dc6e9e769af12e6f2c8b97094c418105d9476&token=776659970&lang=zh_CN#rd) * [高并发环境下如何优化Tomcat性能?看完我懂了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489346&idx=1&sn=142d2b2a1ae798e99a11d1790ea9b140&chksm=cf55a043f822295549c962657928b41b114a8cbdd57e2dbea5b0c63958a9ac6a6ff7996c247a&token=776659970&lang=zh_CN#rd) * [不废话,言简意赅介绍BlockingQueue](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489345&idx=1&sn=9535aa3ee6d7c85f03b82c1cacf7a036&chksm=cf55a040f8222956a7c22c3265111496aea8a4100ff47839e86fc8f479a3c878b3ca4e653822&token=776659970&lang=zh_CN#rd) * 
[高并发场景下创建多少线程才合适?一条公式帮你搞定!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489344&idx=1&sn=be3cb8045217b8c1959d0dcb5291c0c9&chksm=cf55a041f82229577c628e0ddfcc262647cf085fef008399e3ae4d94ad19dbc5bf06d19eabce&token=776659970&lang=zh_CN#rd) * [高并发场景下如何优化加锁方式?看完这篇我确实明白了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489343&idx=1&sn=5a3ff74875272d8a72a80037d25d9aad&chksm=cf55a03ef822292878e2aea6dad98d7d2e39c969f4f2f1ed608e76a71bdfac671f41489c15b9&token=776659970&lang=zh_CN#rd) * [如何使用Java7中提供的Fork/Join框架实现高并发程序?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489342&idx=1&sn=d1143e36dcff7bb209a36c0ee0390068&chksm=cf55a03ff8222929684aa9fcec7eee6686b402f426c5f5c3959f3c2b99559ae0f60a35be3038&token=776659970&lang=zh_CN#rd) * [高并发分布式锁架构解密,不是所有的锁都是分布式锁!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489340&idx=1&sn=9658cbf46be5c8a1dcb6f27f7cbd2483&chksm=cf55a03df822292b3d9cee0ab031b5f706daf2755ec88eaf38242016b04b989f0bf00be4ae63&token=776659970&lang=zh_CN#rd) * [终于弄懂为什么局部变量是线程安全的了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489339&idx=1&sn=56a2340cf337ab5cb3f7f66b98ffde52&chksm=cf55a03af822292ce9bc4b1ff334ef9a01f011b6039c7b2e8b9f7f85317688209f675bbe440d&token=776659970&lang=zh_CN#rd) * [信不信?以面向对象的思想是可以写好高并发程序的!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489338&idx=1&sn=5747a4ae08e8057e0771afe84ea67ae0&chksm=cf55a03bf822292d6aaa7dcf3910c8d58a5a4109190818dcdcc874d414d0464703a83c8f3b9b&token=776659970&lang=zh_CN#rd) * [何为Happens-Before原则?这次彻底懂了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489215&idx=1&sn=50bf075cbbd35539e6e65ea54c0f03b4&chksm=cf55a1bef82228a87ba23f617816136806c1d52cfdd781e4e118ae7da105c54cd3e3b647dd00&token=776659970&lang=zh_CN#rd) * [又一个朋友面试栽在了Thread类的stop()方法和interrupt()方法上!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489337&idx=1&sn=82283e6bafb72b34979d45678911f41c&chksm=cf55a038f822292e85e95244ad359f331b51a14507d6ef94e2ce1343d4281595ee011bcf8289&token=776659970&lang=zh_CN#rd) * [由InterruptedException异常引发的思考](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489336&idx=1&sn=796d3889093e667b2714fda3c83474c4&chksm=cf55a039f822292fde8b1e7a29b1f77990818408992d1529b804f85b77b73e1c98fcedf4dc97&token=776659970&lang=zh_CN#rd) * [朋友问我学习高并发需不需要阅读源码,我是这样分析的!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489335&idx=1&sn=ffc3c88746f28d886811b34760f32782&chksm=cf55a036f82229206134a02be74fb6c18441e0fcda700ef060241da1ccce60c9278772836e36&token=776659970&lang=zh_CN#rd) * [线程的生命周期其实没有我们想象的那么简单!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489334&idx=1&sn=f98baf5bab13e35493c77485fbb3f05c&chksm=cf55a037f8222921cf534290ab39d2f2d2c6e38a41d5761e399f1240ee238b8f54735953d499&token=776659970&lang=zh_CN#rd) * [你敢信??HashMap竟然干掉了CPU!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489325&idx=1&sn=87cb0f267ef101f26a5d47a6a5d8653b&chksm=cf55a02cf822293a6b4cf124bf4c3374deba32397249e90706de97acf4910aebfe9ccacf19d7&token=776659970&lang=zh_CN#rd) * [ThreadLocal学会了这些,你也能和面试官扯皮了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489324&idx=1&sn=0a35687e86500cf5a7e3831a703acd7d&chksm=cf55a02df822293b0f96e98f12f3d0fc99063b50eff5ec01d3291c1cab2a89e7c47370938706&token=776659970&lang=zh_CN#rd) * [为何高并发系统中都要使用消息队列?这次彻底懂了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489274&idx=1&sn=474d6b4fedbb50a9c9a0fdd5ed7e34b5&chksm=cf55a1fbf82228ed1fec52eaf8e3d076f6f40322038c881929c47e0e671cd36d966b7b6422ed&token=776659970&lang=zh_CN#rd) * 
[学习高并发我们究竟要不要读源码?别纠结,肝就完了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489273&idx=1&sn=6d3f38b52de0a60d78a18c4ea14133c4&chksm=cf55a1f8f82228ee84612c8fbc0cb4fd00f67f29ddc2845931d5f050e61c4b68e031d8074f7b&token=776659970&lang=zh_CN#rd) * [面试官问我如何使用Nginx实现限流,我如此回答轻松拿到了Offer!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489271&idx=1&sn=00c81bed5650110649483188d4bf75a5&chksm=cf55a1f6f82228e0168a3ea55043390b682b591ab2083be9b4153be073e082fc01e486ee0db0&token=776659970&lang=zh_CN#rd) * [如何设计一个支撑高并发大流量的系统?这次我将设计思路分享给大家!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489254&idx=1&sn=91c6d5c829d5eb0b4c95a22752f8f5bc&chksm=cf55a1e7f82228f1b4d0d2ec67c6036046e9873f89bc77bdb57f39ea5912185ac2130b0511a9&token=776659970&lang=zh_CN#rd) * [高并发环境下构建缓存服务需要注意哪些问题?我和阿里P9聊了很久!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489253&idx=1&sn=78aa33212e7ce5d5bdbeb07e5baf2aaf&chksm=cf55a1e4f82228f2f1c9ed1e1685534f64319ccb3837b40497dca1f2f984d6145ad75cdb242e&token=776659970&lang=zh_CN#rd) * [如何实现亿级流量下的分布式限流?这些理论你必须掌握!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489252&idx=1&sn=7fbef4c1a27fc01f865153adf409370c&chksm=cf55a1e5f82228f3df2f7a5f835c054fa9be1508f5c4b00bb1912418bcdce86510ba15f3fbd8&token=776659970&lang=zh_CN#rd) * [如何实现亿级流量下的分布式限流?这些算法你必须掌握!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489251&idx=1&sn=e4445792085261c1515a0d02d8645bbe&chksm=cf55a1e2f82228f43039ba9916270e9efc2324949bc01d783c19492bd372b6645941dd85b632&token=776659970&lang=zh_CN#rd) * [亿级流量场景下如何为HTTP接口限流?看完我懂了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489250&idx=1&sn=23a68c11f125c0b5bcda767df5965101&chksm=cf55a1e3f82228f5708fa09147f7997ac64b5587c095ede1eca2c9699d8d13919ce26afea63a&token=776659970&lang=zh_CN#rd) * [亿级流量场景下如何实现分布式限流?看完我彻底懂了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489249&idx=1&sn=bd7b103f9bece48e8b0ab41a6d8fefd7&chksm=cf55a1e0f82228f673a40be35c3dddea15941849bd026e6dc0b044de76a28a7f7787784048f4&token=776659970&lang=zh_CN#rd) * [Redis如何助力高并发秒杀系统?看完这篇我彻底懂了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489194&idx=1&sn=ea0631cc03ae8e57c002e94cec9472b9&chksm=cf55a1abf82228bdcd3d690b50a423097c959a5993b529a85dc58baea5d54f1c0330c3e4423e&token=776659970&lang=zh_CN#rd) * [面试官:Java中提供了synchronized,为什么还要提供Lock呢?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489191&idx=1&sn=77c3ac60313a2e54e4cb719264f02cb4&chksm=cf55a1a6f82228b0094495740752f02e0614ba35bf543ad06b89873f6bce461f9a5d46b485a6&token=776659970&lang=zh_CN#rd) * [一文搞懂PV、UV、VV、IP及其关系与计算](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489164&idx=1&sn=d35d5ddb37f3d35e453bc6297cffbed9&chksm=cf55a18df822289bfd6b676acb73febc764e8a1c959d1d167523dff92d48150fbcbf21d3f016&token=776659970&lang=zh_CN#rd) * [学好并发编程,需要掌握这些核心知识体系!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489161&idx=1&sn=4e5cb69ffd54e4bc083191dc44313d95&chksm=cf55a188f822289e0e2184932c3244bd3016c0011ce6b3865e219bdf372149cfc8d9a9653854&token=776659970&lang=zh_CN#rd) * [高并发秒杀系统架构解密,不是所有的秒杀都是秒杀!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489157&idx=1&sn=6e76a0608f7873e4e34f0a67f2d94ec2&chksm=cf55a184f8222892ab4b82791b1ae300957720f6ea59624c497328576e2e05456fe6935ba0d8&token=776659970&lang=zh_CN#rd) * [讲讲高并发场景下如何优化加锁方式?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489156&idx=1&sn=28e9fa305c28ecff85d3d2e6f6a45ed9&chksm=cf55a185f822289370f55ba9e42403bc17aef087bde3c358d1e24c9c991b384284fbc0699e92&token=776659970&lang=zh_CN#rd) * 
[高并发分布式锁架构解密,不是所有的锁都是分布式锁(升级版)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489154&idx=1&sn=22faf83caf06638dd5f3547b08340fbe&chksm=cf55a183f8222895a237c3d0b4b6ddb92ed4d93600c37b1edd413716aed465bb60501bfa5d39&token=776659970&lang=zh_CN#rd) * [学好并发编程,关键是要理解这三个核心问题!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489151&idx=1&sn=b25b6a43c301cd02df48232d2c89c0d9&chksm=cf55a17ef82228684795669e2564be75d768968e7d2b7eabaae81d837c4ab53103ea95f46f4f&token=776659970&lang=zh_CN#rd) * [为啥局部变量是线程安全的?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489144&idx=1&sn=77bc4f740e68c7b706fb2a1619d85352&chksm=cf55a179f822286f78b604c3ecdd9cf50e1315f655c552b0fd999218b5671aff7a7636893b98&token=776659970&lang=zh_CN#rd) * [并发编程到底应该学什么?一张图秒懂!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489130&idx=1&sn=981c1bbf315127a94d5152ac7706cda4&chksm=cf55a16bf822287d8e5988874de65001590ff2915cf3092e837c866bb3111f752cc8b71f9ef8&token=776659970&lang=zh_CN#rd) * [并发编程中一种经典的分而治之的思想!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489094&idx=1&sn=9cd9c4efeb31d96cde375f15ea8e6ccb&chksm=cf55a147f8222851080c47f4dd5a4ef3aff6d2bdc7d9be3640f8dd9ab05e2ec32210719bba38&token=776659970&lang=zh_CN#rd) * [我们该如何正确的中断一个正在执行的线程??](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489540&idx=1&sn=c86aa20794d406f11ce79b40ce356ef3&chksm=cf55af05f822261382ad28c05d0eb4374e3a941ee258ff09f78e5d0eceeb90b54a9cb5380825&token=776659970&lang=zh_CN#rd) * [内存模型(八种操作+同步规则)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489468&idx=1&sn=d3f45ddb7405b36eaaadb45a4ee1672e&chksm=cf55a0bdf82229ab8f92282d836c0be4697ed7d0aacfc1fa3775db0cccd7089e187351b85d8d&token=776659970&lang=zh_CN#rd) * [如何安全的发布对象(含各种单例代码分析)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489467&idx=1&sn=47ae1d1de0468eb6b6f9a96691fe2a8c&chksm=cf55a0baf82229ac617384c100a4279b4107c6a6758e1001965f34df274c8bb27ce4c9b47ecf&token=776659970&lang=zh_CN#rd) ### 手撸源码系列 * [这次我设计了一款TPS百万级别的分布式、高性能、可扩展的RPC框架](https://mp.weixin.qq.com/s/5HOUg49X0xQmkQjMiWnaIg) * [手撸一款属于自己的Maven插件,说干就干!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497660&idx=1&sn=fb74836a052e0433ce64db1fcc5c3524&chksm=cf5640bdf821c9ab369207fa39caede4af0c822754203b4431b7f190699495b77c9dba859e09&token=1170633251&lang=zh_CN#rd) ### 🔥🔥🔥 SA实战 #### 第一篇:专栏设计 * [SA实战 ·《SpringCloud Alibaba实战》第1章-专栏设计:专栏开篇](https://mp.weixin.qq.com/s/1kS9RImTuL4hCnbxgfY1HQ) * [SA实战 ·《SpringCloud Alibaba实战》第2章-专栏设计:专栏总体设计](https://mp.weixin.qq.com/s/k93hPaozpq0yZyhzTn3AqA) #### 第二篇:微服务介绍 * [SA实战 ·《SpringCloud Alibaba实战》第3章-微服务介绍:微服务核心技术介绍](https://mp.weixin.qq.com/s/jT5YqhMakXh8Fe5lOASwUQ) #### 第三篇:微服务环境搭建 * [SA实战 ·《SpringCloud Alibaba实战》第4章-微服务环境搭建:项目说明、流程设计、技术选型与模块划分](https://mp.weixin.qq.com/s/WFFDr1gHrYNN17xk3MyYjw) * [SA实战 ·《SpringCloud Alibaba实战》第5章-微服务环境搭建:开撸:完成通用模块的开发](https://mp.weixin.qq.com/s/BoKOEKzmJmHqVtgYBZyp-w) * [SA实战 ·《SpringCloud Alibaba实战》第6章-微服务环境搭建:快速搭建三大微服务并完成交互开发与测试](https://mp.weixin.qq.com/s/9baarUeDNOsRKraOzV_wyA) #### 第四篇:服务治理 * [SA实战 ·《SpringCloud Alibaba实战》第7章-服务治理:实现服务的自动注册与发现](https://mp.weixin.qq.com/s/gKoEHbdcnGqW6fKdUFyUBg) * [SA实战 ·《SpringCloud Alibaba实战》第8章-服务治理:实现服务调用的负载均衡](https://mp.weixin.qq.com/s/G2iXsMUJBwdnUEsQKkiyGQ) #### 第五篇:服务容错 * [SA实战 ·《SpringCloud Alibaba实战》第9章-服务容错:服务雪崩与容错方案](https://mp.weixin.qq.com/s/B9YoUVdUuXtWQczP7ZVlBA) * [SA实战 ·《SpringCloud Alibaba实战》第10章-服务容错:项目整合Sentinel实现限流与容错](https://mp.weixin.qq.com/s/wBb6lUEWf9jMNm56BDpspA) * [SA实战 ·《SpringCloud 
Alibaba实战》第11章-服务容错加餐:Sentinel核心技术](https://mp.weixin.qq.com/s/D_YpHey3mN0YZbzGL5-RIQ)

#### 第六篇:服务网关

* [SA实战 ·《SpringCloud Alibaba实战》第12章-服务网关:网关概述与核心架构](https://mp.weixin.qq.com/s/cx3crP7BnQwtCSJ7rFg8Rw)
* [SA实战 ·《SpringCloud Alibaba实战》第13章-服务网关:项目整合SpringCloud Gateway网关](https://mp.weixin.qq.com/s/9HO15etc6PTZrQ1G6PuwhQ)
* [SA实战 ·《SpringCloud Alibaba实战》第14章-服务网关:SpringCloud Gateway核心技术](https://mp.weixin.qq.com/s/Byz2bClG8MdNTw8ujBgvHQ)

#### 第七篇:链路追踪

* [SA实战 ·《SpringCloud Alibaba实战》第15章-链路追踪:核心原理与解决方案](https://mp.weixin.qq.com/s/5NeBqv4-wGZ1A4Jsh-Qjng)
* [SA实战 ·《SpringCloud Alibaba实战》第16章-链路追踪:项目整合Sleuth实现链路追踪](https://mp.weixin.qq.com/s/KRLxhqV2m6v0QmE26_UszA)
* [SA实战 ·《SpringCloud Alibaba实战》第17章-链路追踪:Sleuth整合ZipKin](https://mp.weixin.qq.com/s/IJ2ijg-p8yamyfRH1Po42A)

#### 第八篇:消息服务

* [SA实战 ·《SpringCloud Alibaba实战》第18章-消息服务:MQ使用场景与选型对比](https://mp.weixin.qq.com/s/jfExlVktM_LQlS0UcQrLKA)
* [SA实战 ·《SpringCloud Alibaba实战》第19章-消息服务:项目整合RocketMQ](https://mp.weixin.qq.com/s/mcx4juYK3FseyFeHYJ4jLw)
* [SA实战 ·《SpringCloud Alibaba实战》第20章-消息服务:RocketMQ核心技术](https://mp.weixin.qq.com/s/P3XaEqheUZg2KIdpGtvzfA)

#### 第九篇:服务配置

* [SA实战 ·《SpringCloud Alibaba实战》第21章-服务配置:服务配置介绍与Nacos核心概念](https://mp.weixin.qq.com/s/T7Fv1dOiwcZnxvwS-xoovQ)
* [SA实战 ·《SpringCloud Alibaba实战》第22章-服务配置:项目整合Nacos配置中心](https://mp.weixin.qq.com/s/UPRX0e5CsI1Wy2lBpfUdSA)
* [SA实战 ·《SpringCloud Alibaba实战》第23章-服务配置:实现动态刷新与配置共享](https://mp.weixin.qq.com/s/D9WYSweokRxQ4EkHxuly4A)

#### 第十篇:分布式事务

* [SA实战 ·《SpringCloud Alibaba实战》第24章-分布式事务:分布式事务核心原理与Seata介绍](https://mp.weixin.qq.com/s/aH5bxLe21p7aem2ze2XEpQ)
* [SA实战 ·《SpringCloud Alibaba实战》第25章-分布式事务:项目整合Seata实现分布式事务](https://mp.weixin.qq.com/s/I0rOjP0nMlemzP4ZJsbPCw)

#### 结束语

* [SA实战 ·《SpringCloud Alibaba实战》第26章-专栏总结与后续规划](https://mp.weixin.qq.com/s/_OwhURhLDanxgOoiWALOQA)
* [十大篇章,共26个章节,332页,正式发布!!](https://mp.weixin.qq.com/s/nuK5qC0TCQ6UYlyv0Y0m8A)
* [这本书又又又升级了!!](https://mp.weixin.qq.com/s/cDSwepgOHYTH5q5nYWkpfQ)

#### 整合bhrpc框架

* [震撼!自己手写的100+工程的RPC框架成功整合了SpringCloud Alibaba](https://mp.weixin.qq.com/s/a2Y-qP6okUtkSKpxt8GzCg)

### 🔥🔥🔥 MySQL核心知识

* [《MySQL核心知识手册》完结,共19章,16万字,打包发布!](https://mp.weixin.qq.com/s/jd7vTSZ7wqj3BMLRlhn3DA)
* [《MySQL核心知识》第1章:开篇:专栏介绍](https://mp.weixin.qq.com/s/oJyL6xz7YYDTbMT8bpz-dw)
* [《MySQL核心知识》第2章:MySQL常用的命令](https://mp.weixin.qq.com/s/kIDUVms_MstfddVSxymZnQ)
* [《MySQL核心知识》第3章:MySQL中的运算符](https://mp.weixin.qq.com/s/VcQIaMthW-gpjSuqyquZxQ)
* [《MySQL核心知识》第4章:简单语法](https://mp.weixin.qq.com/s/UW-ZUqfDMx-Z7CjSrmnlRg)
* [《MySQL核心知识》第5章:字段长度与类型宽度](https://mp.weixin.qq.com/s/JcpqhKJJr_S9I1Xj9Zj1mA)
* [《MySQL核心知识》第6章:查询语句](https://mp.weixin.qq.com/s/9JsCVxExqaviOiYO_bx6cg)
* [《MySQL核心知识》第7章:插入、更新、删除](https://mp.weixin.qq.com/s/_1_fiVUpFqOtb9x3jIZCAw)
* [《MySQL核心知识》第8章:索引](https://mp.weixin.qq.com/s/JxLo3_lvyL1zB7GLS4jsUw)
* [《MySQL核心知识》第9章:函数](https://mp.weixin.qq.com/s/MhTiA-LtqKMjw-UkfQcUNQ)
* [《MySQL核心知识》第10章:自定义存储过程和函数](https://mp.weixin.qq.com/s/B5J9G23Ap7aF3F3RgzSHFA)
* [《MySQL核心知识》第11章:视图](https://mp.weixin.qq.com/s/JnMc_fL8YK4HK2oy4JXRWw)
* [《MySQL核心知识》第12章:触发器](https://mp.weixin.qq.com/s/b1MTNUGlyG-JMNhLHBFiNg)
* [《MySQL核心知识》第13章:权限管理](https://mp.weixin.qq.com/s/9tRNKIZGIrXDah96od5DUQ)
* [《MySQL核心知识》第14章:数据备份与恢复](https://mp.weixin.qq.com/s/cC5STY0Cxiyddep6qNBEVg)
* [《MySQL核心知识》第15章:自动备份与恢复MySQL数据库并发送Email邮件](https://mp.weixin.qq.com/s/ckPHvJdi5yz6hdUE_xD3qA)
* [《MySQL核心知识》第16章:日志](https://mp.weixin.qq.com/s/u_j7eOYm7GlwCl2qiFC8vA)
* [《MySQL核心知识》第17章:性能优化](https://mp.weixin.qq.com/s/FS0plgV5a4S-g8AyGnzlww) * 
[《MySQL核心知识》第18章:复制](https://mp.weixin.qq.com/s/3AnCll6wopLgF7pwnvk27Q) ### 字节码编程 * [硬核 | 基于ASM实现Java类与接口的动态代理](硬核 | 基于ASM实现Java类与接口的动态代理) * [字节码编程|使用Javassist生成JavaBean](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498302&idx=1&sn=41b853ab5a43befaede06931e14369b1&chksm=cf564d3ff821c42982ea40ae98f3f2f09c520ff38103154e9f2c7d056654393177b7247eeac3&token=2138570566&lang=zh_CN#rd) * [字节码编程|工作多年的你是否接触过这种技术?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498023&idx=1&sn=fc208f8ac5423f96e5a8f49589f8f4b4&chksm=cf564e26f821c730b2bab402d24852ca64dc721732031b0f8656f339c91266823951dfc46d53&token=2138570566&lang=zh_CN#rd) * [字节码编程|使用Javassist动态生成Hello World](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498262&idx=1&sn=8a72a0d60817fd40eda57f7f47f146ee&chksm=cf564d17f821c401a2c19944091044c318104cb684c439fdf66f26128307e127325656b73359&token=2138570566&lang=zh_CN#rd) ### 面试必问系列 * [年前最后一本《面试圣经》电子书](https://mp.weixin.qq.com/s/45oLT-uRD2PDyvmKA9oUgg) * [面试必问 | 聊聊MySQL三大核心日志的实现原理?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498705&idx=1&sn=d62fe252758439f40490392401640c87&chksm=cf564cd0f821c5c6b67534467252ad8b24ee30a9be3dcabbe91f7a9756d15a25bdc90b358820&token=2138570566&lang=zh_CN#rd) * [面试必问 | 如何设计一款高并发的消息中间件?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498669&idx=1&sn=198733a3e5d2986ee578aa7b941be558&chksm=cf564cacf821c5bac83ac61f373d3b4fe0cef05e4b4b20257ae32e65f775c70f06c971458a47&token=2138570566&lang=zh_CN#rd) * [面试必问|哪些场景下Spring的事务会失效?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498649&idx=1&sn=ccb0343afda59782a81cdf37e292868b&chksm=cf564c98f821c58e739e36f74791495b7452a96650aaa72aa6501745c20ab806546e407c5a70&token=2138570566&lang=zh_CN#rd) * [面试必问|聊聊MyBatis执行流程?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498605&idx=1&sn=25484ab4786d002861454633b8a04e38&chksm=cf564c6cf821c57a4201860058c99d7617aed9851a68a2925b543c16596c688390b9646af1b4&token=2138570566&lang=zh_CN#rd) * [面试必问|聊聊JVM性能调优?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498381&idx=1&sn=e76f43841f9d2a6f2bf7c2369944efc5&chksm=cf564d8cf821c49add82afea08dceb02a7d59788d70d350bafd002964831b6fabb3b2629f401&token=2138570566&lang=zh_CN#rd) * [面试必问 | 聊聊Kafka的消费模型?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498784&idx=1&sn=e645d07e2e4e3341a628f89f93d2e7d2&chksm=cf564b21f821c23746d15b4f4cca534ea6d83105ca3ce9eac48f7e8d11bb3e37d1194fb89a67&token=2138570566&lang=zh_CN#rd) * [面试必问 | 为何数据库连接池不采用IO多路复用?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499378&idx=1&sn=608f17c9da784e6863e7db36ab2be414&chksm=cf564973f821c065f1a3d772ab5d84d686830a07663d9b57539c37d356a8685f2dc5e4ee15b4&token=2138570566&lang=zh_CN#rd) * [面试必问 | 一个线程从创建到消亡要经历哪些阶段?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247500042&idx=1&sn=a6bb4c40af2edaa8f85ce72c3b562cb9&chksm=cf56760bf821ff1d065a11bbea690ccf6263ecf71b6066c4241323f953eab25a359ffec71718&token=2138570566&lang=zh_CN#rd) * [连续40场面试后的一点总结与心得(文末附面试真题](https://mp.weixin.qq.com/s/FLz3xixrOZ-mrbp23MwdWg) * [恭喜星球又一名小伙伴上岸大厂(附面试真题)](https://mp.weixin.qq.com/s/b1l2wDUIQDRkMwU830wYWA) * [面试现场 | 聊聊JVM性能调优?](https://mp.weixin.qq.com/s/w4kkd6lKtlS0Bv-S6Uptkw) ### 架构师进阶系列 * [架构师与高级开发工程师的分水岭是啥?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495897&idx=1&sn=a5479e4aebf18a3e4cc30c8bec807051&chksm=cf5647d8f821ceceb42deb500f4bdc4d7d28bc64f671a0da69ca955ecbf6b82b2396e42064f3&token=546222828&lang=zh_CN#rd) * 
[要想进大厂成为架构师需要掌握哪些硬核技能?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496020&idx=1&sn=aae3ea84c13d82e4027fb0381633d54b&chksm=cf564655f821cf4372325deb49ab982b4e290c73bdfba16095fae7d545a958d54a2686e1a329&token=1837443586&lang=zh_CN#rd) * [JVM整体架构与调优参数说明](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495960&idx=1&sn=489edee13ada9612e3a92b3f624bb02a&chksm=cf564619f821cf0ff31050587a2979f9847ce1930f653f3d1d087d42346ab7625f6bcb90ed39&token=1837443586&lang=zh_CN#rd) * [逃逸分析,yyds!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497051&idx=1&sn=38312719e86b7f9cef31c00102da6c48&chksm=cf56425af821cb4c779873fdd793a73af274305f74de7ae72a06fcd2d9224f44541f4a2cad40&token=1170633251&lang=zh_CN#rd) * [熔断,限流,降级,容错,服务高可用(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498714&idx=1&sn=41d6ea250b3d3bec192be68abd2358a0&chksm=cf564cdbf821c5cd28edda7ef364c25860505e9172be3b75a1ac952a69df901ee512047b84e6&token=2138570566&lang=zh_CN#rd) * [高并发,高性能,高可用,高扩展,分布式,微服务,字节码(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498735&idx=1&sn=2639176344c13e9848e97824555aefdc&chksm=cf564ceef821c5f8798e633cd3f9921f0eeb91c3dc8a558b584b95a5468990d7ef9bc062cf2d&token=2138570566&lang=zh_CN#rd) * [定时轮询,DelayQueue,ZSet,TTL+DLX(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498767&idx=1&sn=9a4a0cc594ffd796bafc48992feffafb&chksm=cf564b0ef821c2187b47a78bf457a9152496a4ce10e70114e6f87f85aeab036a589223598c11&token=2138570566&lang=zh_CN#rd) * [服务端高并发分布式架构演进之路(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498861&idx=1&sn=629c0a78e5e9bf00a1ee34fa64efda80&chksm=cf564b6cf821c27a4d4aafa13bd6d60c9a923d2984bc98ca0163326bb77196375b22400fd6c4&token=2138570566&lang=zh_CN#rd) * [高可用架构(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498923&idx=1&sn=536517b094338c6a17e4e259d1b1edd7&chksm=cf564baaf821c2bc339dac8b350d039531bd2a82aeed051a7c76ceb960b99448b0dceff9b70c&token=2138570566&lang=zh_CN#rd) * [高性能架构(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498925&idx=1&sn=575dc293c689c9313fbc0da4739a2e4e&chksm=cf564bacf821c2ba95055c28c33e6ce9373e5b297528507c91f6d1507d69a87dc88447408fa5&token=2138570566&lang=zh_CN#rd) * [在大厂担任架构师需要掌握哪些核心技能?(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498939&idx=1&sn=3d4920bdfd9a6df6b7a3feb28a77f0d0&chksm=cf564bbaf821c2acac70e488f1a4fc8b009d9323a3717fa651463e0650fca0d06682d8555c3b&token=2138570566&lang=zh_CN#rd) * [不同业务场景该如何选择缓存的读写策略?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247500047&idx=1&sn=17dcb702ad1533bb77861e467a4aebe2&chksm=cf56760ef821ff1820387097baa8d67678ed19df172f3622dbc5b7ab1ccc7bde54301d83dc11&token=2138570566&lang=zh_CN#rd) * [复盘:我在真实场景下对几款主流云原生数据库进行极限性能压测的一次总结!!(建议收藏)](https://mp.weixin.qq.com/s/AY3mlwrPGD5hbob0Ja73TA) ### 分布式事务系列 * [面试分布式事务必问知识点!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497675&idx=1&sn=ff56c55baa17f90d9bcf2bb5478771ab&chksm=cf5640caf821c9dce560d6b7ea619b4cbbb9fc082c4858131d1dc846a5275b9214e579d79a43&token=1170633251&lang=zh_CN#rd) * [tcc-transaction分布式TCC型事务框架搭建与实战案例(基于Dubbo/Dubbox)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489327&idx=1&sn=d7a1bd61e6403d2adf8c393dd43ede4d&chksm=cf55a02ef82229389c92eed4047980df8dddbc0cff0573f5e7d701aed8e58c9fd86d1c9345c0&token=776659970&lang=zh_CN#rd) * 
[如何基于消息中间件实现分布式事务?万字长文给你答案!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489296&idx=1&sn=b14e1a2636038996785330afd7ac4690&chksm=cf55a011f822290717d8c33588bb9f19c3d3bbd6606bd9d5d262cdb7ee86db75f9022369b2c0&token=776659970&lang=zh_CN#rd) * [面试官问我:如何恢复MySQL中的XA事务?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489293&idx=1&sn=481df6b2ec975b7d5065a30306c99ff9&chksm=cf55a00cf822291a6cbaa63450bce39150bc6515a86733b455a3079b898e4242de4caef39346&token=776659970&lang=zh_CN#rd) * [12张图带你彻底理解分布式事务产生的场景和解决方案!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489572&idx=1&sn=283f24189f3add773ffa2f8a314ac8d1&chksm=cf55af25f822263342c3d67490d803b1de9a7199ee765c8614f7dda90441480877d2af02c8ef&token=776659970&lang=zh_CN#rd) * [如何基于消息中间件实现分布式事务?我想说的都在这儿了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489149&idx=1&sn=6d82218e24a08b56b1ae16230a313b4e&chksm=cf55a17cf822286a31a09299d293c96c8f41f37350b2e4fe9cdd6706820321b163e7a3e4232b&token=776659970&lang=zh_CN#rd) * [tcc-transaction分布式TCC型事务框架搭建与实战案例(基于Dubbo/Dubbox)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489458&idx=1&sn=3f9997b178148c0152110e888aac45bc&chksm=cf55a0b3f82229a59dcc4754dd5b4cc355b960c44aaeeb22785da96b6f832cf9e42842d1d117&token=776659970&lang=zh_CN#rd) ### 精通JVM系列 * [爆肝 | 一文彻底吃透JVM系列](https://mp.weixin.qq.com/s/GAjZbMs9TQjM6zb91zfDrA) * [逃逸分析,yyds!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497051&idx=1&sn=38312719e86b7f9cef31c00102da6c48&chksm=cf56425af821cb4c779873fdd793a73af274305f74de7ae72a06fcd2d9224f44541f4a2cad40&token=1170633251&lang=zh_CN#rd) * [关于JVM,你需要掌握这些!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489318&idx=1&sn=bd0b6912601dde225cb9212971a41de6&chksm=cf55a027f82229315a3fa594538c8f4b7dc2300d0480e684070d54f5f5e1896533fe1cdb6c47&token=776659970&lang=zh_CN#rd) * [肝了一周,吐血整理出这份超硬核的JVM笔记(升级版)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489122&idx=1&sn=760a751b5d0f927a39051c39f03c19da&chksm=cf55a163f8222875a1422bd2b5550f1c91ebface3773f171b259d60937e17b8514231b8f3069&token=776659970&lang=zh_CN#rd) * [JVM 内存布局(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498909&idx=1&sn=67a21df964f45f10cbdd669a97a5cc07&chksm=cf564b9cf821c28a09876886f69c47c0aed8086dc124d6cbae9b0c9c89764cea81f5ae1a70ed&token=2138570566&lang=zh_CN#rd) * [JVM调优的几种场景(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499173&idx=1&sn=b4ef46c3d2bac53bf5af44acfe3a76c6&chksm=cf564aa4f821c3b291e83b2e48995b5f4c0564156b2ea4e5900fac30f7bbef7d09d64e507e18&token=2138570566&lang=zh_CN#rd) * [类的编译、编译优化、常量这地、泛型擦除、条件编译(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499246&idx=1&sn=4bb2af5be2bfe2ea19fa2c899781b894&chksm=cf564aeff821c3f93e4491f5cf0eef70473c3e337acb1540162c9bdb86a5709f04f59ec7b729&token=2138570566&lang=zh_CN#rd) * [类的加载过程(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499270&idx=1&sn=c4b90d564ffd71fe098f34c75d9a3d77&chksm=cf564907f821c01106be971b821e83fcfbcd5e7b8e510d4582fd0618a2f9dc5a0ecff73219a6&token=2138570566&lang=zh_CN#rd) * [JVM内存空间(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499304&idx=1&sn=13e15f0a0e8999ac28e83f0e90e7e938&chksm=cf564929f821c03f3f81b53a7c11499865c9be39bd1ebd835061c4e7a1744ef89bbefbb48fee&token=2138570566&lang=zh_CN#rd) * 
[JVM堆内存分配机制(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499346&idx=1&sn=fd0b892c55177cc78cc69be0ff7e84c1&chksm=cf564953f821c0454f31bd284748b26ae2aec1795cbf62625fddc64fae99c4687795de3d066e&token=2138570566&lang=zh_CN#rd) * [JVM垃圾回收机制(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499370&idx=1&sn=9584ccbeb437823a59f4a6af7058d4a8&chksm=cf56496bf821c07dc09d199169e0a3057bf3a400420837594dc37d0cc2edfab5ca6fd82b041d&token=2138570566&lang=zh_CN#rd) * [JVM垃圾回收算法(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499394&idx=1&sn=7d7ab984a059d8867a1c8dad5d919a26&chksm=cf564983f821c095d733af641e82dc3e7a23c22c559eb6e07fd52ec9fe3b80b68dbea3cf70d7&token=2138570566&lang=zh_CN#rd) * [JVM - CMS垃圾收集器(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499410&idx=1&sn=787037527afd5762e407626d010a7589&chksm=cf564993f821c085f481e96730ab0c0ff59db0dfa6630e72e3fcfd5ef2656e5f3bfd2b56253e&token=2138570566&lang=zh_CN#rd) * [JVM - G1收集器、Region、停顿时间模型、垃圾回收(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499418&idx=1&sn=d389ba1e25dbf315ddbf92775e4c43b6&chksm=cf56499bf821c08dd1b0d8758169c58a5539ce3d405334d2dc70207f01f9f864fb6093336717&token=2138570566&lang=zh_CN#rd) ### 视频号系列 * [冰河真人出镜聊职业发展!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497995&idx=1&sn=0a06ee6e4e75f57e7bffd4feb2ae7013&chksm=cf564e0af821c71c43703032c78ae398d78ec19513f9fc0cceafbb9906d0108e038eb11eaf51&token=2138570566&lang=zh_CN#rd) * [冰河第一次真人出镜!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497792&idx=1&sn=aeae1755bd4f63dc2821695c9ff4e1b0&chksm=cf564f41f821c6576b100425d1d8e5a39273a22383b2cc90716d9e3b0650fbc021b3e85baa42&token=1170633251&lang=zh_CN#rd) * [不费脑子学习MySQL体系架构,yyds!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496106&idx=1&sn=d25ad3830ce89ee32d46e646eefd9e4f&chksm=cf5646abf821cfbd6251c7886d0b359d263833f912e6bba97531cc001ba7a61f86d2ea46fed9&token=1837443586&lang=zh_CN#rd) * [高铁站露天等车回成都,一个字:冷!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499381&idx=1&sn=2e0e09054db3e376006a3ef8259a413d&chksm=cf564974f821c0627753eba86bffaa16278f3e75865afc32540a3469d3b2d3b62d39375fc7fa&token=2138570566&lang=zh_CN#rd) * [冰河做客墨天轮直播-DBA以及运维岗如何更好的规划职业生涯?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247500484&idx=1&sn=02b2f754e13617744d6413bf1c4adc9e&chksm=cf5675c5f821fcd3f7bad0b881a7d269430ebaf5dea01c7c1aa8ca395055ab2b79e9b6bc4905&token=2138570566&lang=zh_CN#rd) * [冰河直播-现场实战手写Java线程池源码](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247500491&idx=1&sn=bce5b1b513d375cbe4a8635eff9b01c1&chksm=cf5675caf821fcdc7b4b2eeea3a54c2bfb4c2c246b4abe18b4ef0b3d5a107285c225877f2976&token=2138570566&lang=zh_CN#rd) * [技术人如何突破职场瓶颈?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247500604&idx=1&sn=f105d1804929d755ae5eae5cc211cef4&chksm=cf56743df821fd2b2e2facc453039f6629dee41a4132274173335158cd6f3a972ad8081ce1f4&token=2138570566&lang=zh_CN#rd) * [搞不懂MVCC,学个毛的MySQL](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247501045&idx=1&sn=99b4bf092187cb306600bf48de33b50d&chksm=cf5673f4f821fae2f956c5a331320a54ea0c1cb723ee37daa1efbb8b6d85d80f68f73f660c2b&token=2138570566&lang=zh_CN#rd) ### 性能调优系列 * [系统性能优化的关键指标(yyds)!!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495332&idx=1&sn=6fcaf156fae6419a03ede479a30eede4&chksm=cf5659a5f821d0b37d8067bf868fb60780e5794e4220312e55c5f5e9bd3994c72b724f2f379d&token=1081687632&lang=zh_CN#rd) * 
[万万没想到!Linux环境下Tomcat8启动很慢竟然是这样解决的!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489431&idx=1&sn=b6b4a16f2f2d2e2368e92af7a0f0e7f1&chksm=cf55a096f82229805105038bfbf9d1509d56dc4edd425bbd2d95e1f2958b339f9a7d4fbaf95a&token=776659970&lang=zh_CN#rd) * [几种典型的内存溢出案例,都在这儿了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489320&idx=1&sn=5b4e3327c01c9a9b5a44701e2861abfa&chksm=cf55a029f822293f350d5d51d29a6c947422e032218b4c519df57c7b3c7e4d949eae15844959&token=776659970&lang=zh_CN#rd) * [性能优化有哪些衡量指标?需要注意什么?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489189&idx=1&sn=e4a756f789f6840f5d1a3e006afe21b6&chksm=cf55a1a4f82228b2a6d271ecbfc01ea363098ae656356cf69a6b1b92cd5b74b780b6d8e95de8&token=776659970&lang=zh_CN#rd) * [小伙伴问我性能指标监控怎么做,这次我安排上了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489187&idx=1&sn=bdb408cd594e9ba4c2bf625c0bd633f1&chksm=cf55a1a2f82228b4f6c6a25b164fa303e1e5c6e6df22e88c15ce22ebb9720fc11a6bff6ed89c&token=776659970&lang=zh_CN#rd) * [面试官:Java中的对象和数组都是在堆上分配的吗?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489185&idx=1&sn=63186214b5145a5f6567d9bae6fd34e6&chksm=cf55a1a0f82228b6ec127d7ece0b1343c970edc17861d4db6b361bd16da863676082ea43d4fe&token=776659970&lang=zh_CN#rd) * [你敢信?String类竟然是导致生产环境频繁内存溢出的罪魁祸首!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489152&idx=1&sn=ea43c3d09d42e780dcc26d35c6f9cd86&chksm=cf55a181f82228971b193d73fd324963db377c3912fd6410bc91ca5901146681579881a8b971&token=776659970&lang=zh_CN#rd) * [Java中的String到底占用多大的内存空间?你所了解的可能都是错误的!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489150&idx=1&sn=ad7377d24f5e0e7769d1d9753b078694&chksm=cf55a17ff8222869f335ef3367331bc4e187bcd2b82ec1a541bced63d46e48b2f82fcb46b05d&token=776659970&lang=zh_CN#rd) * [做过性能优化的工作吗?你会从哪些方面入手做性能优化呢?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489134&idx=1&sn=fa59dc70f8060a4670bdc5a2630e701f&chksm=cf55a16ff82228791e033427d673895e9a1699ac67983b18da6735a261d58a4695aa97163423&token=776659970&lang=zh_CN#rd) * [纳尼?内存又溢出了?!是时候总结一波了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489132&idx=1&sn=505e80f3a362ade361e90f1eaab22b2e&chksm=cf55a16df822287b64021b550ffe8bfe00ccb3e315568a5bb5f1973b0d22c045c0ba95113344&token=776659970&lang=zh_CN#rd) ### 一起进大厂系列 * [报告:冰河撸了本《面试圣经》电子书!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495500&idx=1&sn=bf750985622fdc84c4191ecfc6d3c964&chksm=cf56584df821d15b5f322e567d34f6aaa01c791f2c287f6ccb8cf751e0c64dae6387c1f0b620&token=1081687632&lang=zh_CN#rd) * [冰河与你聊聊大厂更加看重哪些能力?(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499634&idx=1&sn=c667203b582caaba87b139b2f10cfb57&chksm=cf564873f821c16565a860cbb36f3953f137c7955e6c1e8a0e9a29c417a055480d9c6fad3e99&token=2138570566&lang=zh_CN#rd) * [明明自我感觉良好,可为啥就是进不了大厂呢?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494985&idx=1&sn=208f18e8affe77360a542bcec777d690&chksm=cf565a48f821d35ee69d51ad08cad7e09c15387a2b3a9981369b3037d8817efb32f7f0f7df6a&token=1081687632&lang=zh_CN#rd) * [头条一面:Spring IOC容器中只存放单例Bean吗?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492924&idx=1&sn=39cf807fb5aa26b73d16cb490197a569&chksm=cf56523df821db2b665fa07c3c02c2c966dc25165c09cd555447597dae05ec4a20968d67e709&token=1352513703&lang=zh_CN#rd) * [热乎乎的面经:Spring中Scheduled和Async两种调度方式有啥区别?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492889&idx=1&sn=7b33ecf05cef968930399b46d147c193&chksm=cf565218f821db0ebd23154d38e689c44dd9f5bad3390f41036a3c0f4db1a52cb7644254fee6&token=1352513703&lang=zh_CN#rd) * 
[进大厂,到底需要掌握哪些核心技能?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492877&idx=1&sn=1624a1b0423de3857b28f852deae05f3&chksm=cf56520cf821db1a0ed3baa708f93c4f1d68d9f5d18507ac126bd1b9547e3418f56a31557712&token=1352513703&lang=zh_CN#rd) * [面试官:如果让你设计一个高并发的消息中间件,你会怎么做?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490267&idx=1&sn=2181dd10bc930f49b3bf5a44d62fcabc&chksm=cf55addaf82224cc42086018b6d77a8a17586ec414a324ec8473745150d60b7d1f14c626e2b8&token=776659970&lang=zh_CN#rd) * [同时拿到BATJMD的Offer是怎样的一种体验?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490503&idx=1&sn=3b2ad39a88ec09eeb1dc17d1ce259e7b&chksm=cf55acc6f82225d0cf074310020a2fc2d4ec85ebf44524bad96d2b239aa1fd11cc852c5f1353&token=776659970&lang=zh_CN#rd) * [即使技术再精,面试时一问这个必挂!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490521&idx=1&sn=da5bdb9e9079625a82cd843cbe4d136b&chksm=cf55acd8f82225ce9ef57ea0a2c7076eb267d65d02d703277cc254c35fa73e51d235544d7e2c&token=776659970&lang=zh_CN#rd) * [字节跳动面试官这样问有关字符串的问题!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490675&idx=1&sn=eb63db12f3004dcde676224145128e50&chksm=cf55ab72f82222646f91b4c4c4f6b76b76b672f4f62e2a2916c29a9009d6e2473fadc1d703a9&token=776659970&lang=zh_CN#rd) * [为何你进不了大厂?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490748&idx=1&sn=32ab2d8522b9f47f53d77b4880e15c87&chksm=cf55abbdf82222ab5267e0fe0cb1df953e8ee2f74cd18b1f7d01449611ce647ba880d47d1687&token=776659970&lang=zh_CN#rd) * [冰河公开了进大厂必备的核心技能,服了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490771&idx=1&sn=7a8870e3118117d5ae50b5fe94eda0a4&chksm=cf55abd2f82222c435213ee8d3865a44fcbe6c548a38db2f4c45620afd5ecf0a2cc22f02532d&token=776659970&lang=zh_CN#rd) * [金三银四,冰河为你整理了这份20万字134页的面试圣经!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490906&idx=1&sn=c01aec12b225f9e0e69cba2c73b5e842&chksm=cf55aa5bf822234dcc319e42b803250bfd5ebb2d2b7db951b2409638252533c1d7cd1ae0416f&token=776659970&lang=zh_CN#rd) * [面试官灵魂拷问:if语句执行完else语句真的不会再执行吗?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489321&idx=1&sn=2e44884acb0de536dc905ac2de0a74f8&chksm=cf55a028f822293e67d545942c473014fc9adadab3af793c1229737af6ada254aefd3d75e9ec&token=776659970&lang=zh_CN#rd) * [将 95% 求职者拒之门外的BAT大数据面试题-附解题方法(文末有福利)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489239&idx=1&sn=2c7c85b1145e59a39fba173398254337&chksm=cf55a1d6f82228c044066ff1910b1425719c06c3a0f3d25d7723f27b101e8c636efde4a75220&token=776659970&lang=zh_CN#rd) * [超硬核面经,已拿蚂蚁金服Offer!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489230&idx=1&sn=9b4e0b7b9f49f298e681be3dce108d80&chksm=cf55a1cff82228d9f3369d2706fda98eda41d093827819816463cc4f540ea9954fa31a1d7039&token=776659970&lang=zh_CN#rd) * [面试官问我:数据库中事务的隔离级别有哪些?各自有什么特点?然而。。。](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489220&idx=1&sn=c939b92646054d25d3c8bc9146e3f7ff&chksm=cf55a1c5f82228d3d2c61f289f312bdb31a0f81475112f83413f41d1fe0f51535af986215239&token=776659970&lang=zh_CN#rd) * [面试官:讲讲什么是缓存穿透?击穿?雪崩?如何解决?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489193&idx=1&sn=6d8912a62f5fd09c85a3d44a189e7ef9&chksm=cf55a1a8f82228be7c76699df36377e57347249e1d15b48bad9ef15f2497e9254a36b8d24bad&token=776659970&lang=zh_CN#rd) * [面试官:Java中提供了synchronized,为什么还要提供Lock呢?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489191&idx=1&sn=77c3ac60313a2e54e4cb719264f02cb4&chksm=cf55a1a6f82228b0094495740752f02e0614ba35bf543ad06b89873f6bce461f9a5d46b485a6&token=776659970&lang=zh_CN#rd) * 
[面试官:说说缓存最关心的问题?有哪些类型?回收策略和算法?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489190&idx=1&sn=f47f87c1784664a23eff41b5bbc854d7&chksm=cf55a1a7f82228b1dac0635e31fde41f63cb884456d63422dc845cbe3e7982a2868df744d6a1&token=776659970&lang=zh_CN#rd) * [面试官:讲讲Redis的五大数据类型?如何使用?(内含完整测试源码)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489181&idx=1&sn=aa1928a2d4d6280aa961bfca16da3afb&chksm=cf55a19cf822288a9af29aba1ae321ce74ae0342217270a630dee94503d5bf47b44a082aa0de&token=776659970&lang=zh_CN#rd) * [今天又一名读者斩获蚂蚁金服Offer,就是这么简单!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489148&idx=1&sn=76b7af63c8fdee5eaecd0a09c5988674&chksm=cf55a17df822286b46a67510ccce04c12c0eba5de2ac473213669f48feb5cd63ba282ab4713f&token=776659970&lang=zh_CN#rd) * [面试官:讲讲七层网络模型与TCP三次握手与四次断开?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489146&idx=1&sn=44d502461973c72a418d5597fc9c434b&chksm=cf55a17bf822286d5daf2f7fea9832254d34830df067ef2fff588eadf68151b3e79097a8613f&token=776659970&lang=zh_CN#rd) * [面试官问我:如果让你设计一个高并发的消息中间件,你会怎么做?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489136&idx=1&sn=2fd7d6136c7075cfc38d6286c3ef074c&chksm=cf55a171f8222867d407c6f34ff350c3879fa04f28271bb9525fcdbfe103e6ca58a73bf20733&token=776659970&lang=zh_CN#rd) * [面试官:讲讲类的加载、链接和初始化?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489135&idx=1&sn=321c7121c31cb6cc327bdfc525b88743&chksm=cf55a16ef82228784a74be63c0068875012abf0b2cf969481a86386c9cd3e0fc6e18911500b6&token=776659970&lang=zh_CN#rd) * [面试官:做过性能优化的工作吗?你会从哪些方面入手做性能优化呢?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489134&idx=1&sn=fa59dc70f8060a4670bdc5a2630e701f&chksm=cf55a16ff82228791e033427d673895e9a1699ac67983b18da6735a261d58a4695aa97163423&token=776659970&lang=zh_CN#rd) * [面试官:如何以最高的效率从MySQL中随机查询一条记录?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489133&idx=1&sn=434ca533aee95e2a0e4307f4f5f48345&chksm=cf55a16cf822287af928ff88aed8826a5e3bad5fb51744a539e29733f867ad3abb48cbcd6948&token=776659970&lang=zh_CN#rd) * [面试官:讲讲Redis的虚拟内存?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489131&idx=1&sn=3d3ae1ddcb6f22125b25de0b8e54a5f2&chksm=cf55a16af822287c40c3e74de15a714b816a2a3ffd9c1ca40e2a70937f817bf6622e810497a2&token=776659970&lang=zh_CN#rd) ### 互联网工程 * [冰河亲自整理的Git命令汇总(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497828&idx=1&sn=ec7fabe90450e28f647d2ad964e89bd7&chksm=cf564f65f821c673febc888dfc457e4ac145b36006a7a88cb265ac6b500a71d7f414809172d3&token=2138570566&lang=zh_CN#rd) * [十一长假我肝了这本超硬核PDF,现决定开源!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489163&idx=1&sn=38a317b2e6de99c4f10b5a80df153bd4&chksm=cf55a18af822289c6595fc37674bf1f9188521c1de760cd8cf4801bc716aa974847d79509e24&token=776659970&lang=zh_CN#rd) * [为什么互联网巨头们纷纷使用Git而放弃SVN?(内含Git核心命令与原理总结)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489192&idx=1&sn=1f793496fa2d6eb75559c9eedb4fece5&chksm=cf55a1a9f82228bf9828af75a8765f5cd76baf0beb2fc8078d4994ec9373e856f5f157af50b8&token=776659970&lang=zh_CN#rd) * [我搭建了一套企业级私有Git服务,抗住了每天上万次攻击!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489186&idx=1&sn=fe9e55a4f6d78b5ffc95a166070196d3&chksm=cf55a1a3f82228b590f4aa712435acf65d7340ba762a0f840890ad11db804b3692d94fd80a48&token=776659970&lang=zh_CN#rd) * [使用Maven那么久了,你对企业级Maven的核心配置了解多少?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489169&idx=1&sn=496d477dc668c04c2e84f5f918bf1ebb&chksm=cf55a190f82228860b6516857486b23107d895be452f35da3abeba4ef1acd05805aba5a17c29&token=776659970&lang=zh_CN#rd) 
* [面试官:开发过Maven插件吗?我:开发过啊。。。](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489168&idx=1&sn=17983641eb3efc819e8d8314d1b7ecce&chksm=cf55a191f822288727c44cb457c90749494492e5626a71bc23e193b20c44838f6c19c9dce9a5&token=776659970&lang=zh_CN#rd) * [小伙伴问我:如何搭建Maven私服?我连夜肝了这篇实战文章!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489167&idx=1&sn=86369b6a6e81c45ca2fb59f8f379e39c&chksm=cf55a18ef822289896820df6da422125a2f718433da8258f5cadec2122f6e64e1b38a81a5a98&token=776659970&lang=zh_CN#rd) * [小伙伴想学Jenkins自动构建发布项目,我:安排上了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489166&idx=1&sn=072f0a5324e99f3852ccbfc1b687208c&chksm=cf55a18ff82228994255c4e9abc4edee08cdb7e0343fc408eb7e415c02baedabdf9d13c3a9e0&token=776659970&lang=zh_CN#rd) * [搭建Maven私服(建议收藏)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497927&idx=1&sn=929bb03d1ffc8ec8cd4c38f56c7d7497&chksm=cf564fc6f821c6d0943f1bcfcd86724d1c3afc89249870ddc93fc450aa9020df89a81a15c5f5&token=2138570566&lang=zh_CN#rd) ### 精通渗透系列 * [国庆第一天冰河在车上又肝了一本电子书!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496713&idx=1&sn=d81f0ee9625ee041ebfb0390ba35fbda&chksm=cf564308f821ca1e190ce11cee1725dbd3f1753f27c0f7f266d8633605318af8ee7759422f09&token=1170633251&lang=zh_CN#rd) * [黑客常用SQL注入绕过技术总结!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494362&idx=1&sn=dcc2b6e9722afac2963929800578bbc3&chksm=cf565ddbf821d4cdc5551c60dadbd1ee87facb24844eaa994eb58a87e31477fe56a8c133bc74&token=1772864567&lang=zh_CN#rd) * [卧槽,冰河又写了一本电子书!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494015&idx=1&sn=b4e5c7fb01d6c6619e027a4546b72df2&chksm=cf565e7ef821d768ea6fd3bf3b9369df8934358838c9b12a662ad57c335ef8d5c6ce3b4ae690&token=1929495520&lang=zh_CN#rd) * [《冰河的渗透实战笔记》电子书正式发布!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493194&idx=1&sn=62e3c40b2cf963c1a0b0bc531465249c&chksm=cf56514bf821d85d5f56f437de1911724b89efc324e334fb26c5e0b7fc1ea82f99a0da289ba8&token=447597716&lang=zh_CN#rd) * [我入侵了隔壁妹子的Wifi,发现。。。](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489208&idx=1&sn=5703e777f9aab5277ef840555f12421b&chksm=cf55a1b9f82228af92b558dfe44063ad4ba2622329737bb4be39597b673449af970b71ecf32c&token=776659970&lang=zh_CN#rd) * [《冰河的渗透实战笔记》全网下载突破5W+](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499927&idx=1&sn=69d8fd345b7f74abe492b40acedcc8b6&chksm=cf567796f821fe80d720b9ab0c7b8f4a953129e35ed01fa46f5fed230bdde77eb3054d3ffb3f&token=2138570566&lang=zh_CN#rd) * [冰河连夜复现了Log4j最新史诗级重大漏洞,含视频和完整案例代码,全网最全!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247501317&idx=1&sn=7a7d41d95b4d705e3952008b3d96d9cc&chksm=cf567104f821f81279ea31bae4c38a1f78a7e2637560c4e5807592af3ba2ef91ec7e78240a24&token=2138570566&lang=zh_CN#rd) ### 精通大数据系列 * [极简介绍Zookeeper,我想跟你聊这些!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492408&idx=1&sn=e5d8a1d3e8d4d28f836c11d631b66667&chksm=cf565439f821dd2fa002c09af2c2909e91dcf0e1917ca74e4bcf221530d99f246ec49bcaf5cb&token=78679959&lang=zh_CN#rd) * [精通Zookeeper系列开篇:进大厂不得不学的分布式协同利器!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492272&idx=1&sn=39b21a0f0da6629c72bf50fb21f51479&chksm=cf5655b1f821dca776ab19af8b477bdf590005606742dc4df30d34921b1b05608cb54cc4ba72&token=1053291757&lang=zh_CN#rd) ### 精通运维系列 * [一周时间成功编译了Linux操作系统源码,快要裂开了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495929&idx=1&sn=c776b5b315247bd41b3f506ba7d0d0d4&chksm=cf5647f8f821ceeef76c7f099afd46341e64f0f6bee4779a66b4b4786de869245b5c370d1c7d&token=546222828&lang=zh_CN#rd) * [在服务器命令行输入rm -rf 
/* 究竟会发生什么?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493167&idx=1&sn=0bb0055b6705becf7904bd5fd347a67f&chksm=cf56512ef821d83865cd13a5bbbdc5efba101c84d59aecbee6df0d0651d161e7c47b9163620e&token=447597716&lang=zh_CN#rd) * [我写了一个脚本,可在“任意”服务器上执行命令!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492340&idx=1&sn=35464589fc36bf80e09231d5f08d351b&chksm=cf5655f5f821dce3015bc74fa28f55e2dac08fa3a24e4ce3a99eaebc28b3ad115371b35ba5ae&token=1053291757&lang=zh_CN#rd) * [如何在服务器上添加虚拟IP?看完原来如此简单!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489137&idx=1&sn=95c8b1dffe0734c23a82105d38733eda&chksm=cf55a170f82228664eebe997563ada7a1fb413e42a7ab2ca9714bf79004537aa60f587a520fe&token=776659970&lang=zh_CN#rd) * [你知道如何在Linux操作系统上添加虚拟IP吗?这篇文章帮你搞定](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489463&idx=1&sn=77957646664d4c5c4cb86612810d9a1e&chksm=cf55a0b6f82229a05d2440353a436a8d6409b50a70db08fc83696d1bd4582b78d6f135c70189&token=776659970&lang=zh_CN#rd) * [你知道如何在Linux服务器上挂载新硬盘吗?冰河为你揭开这个秘密](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489454&idx=1&sn=137088bd11428158b3d6cd8317c350cf&chksm=cf55a0aff82229b9cbea3e88e3323b3515af3389da5b69940ff9b64476e363cfeff4729728e1&token=776659970&lang=zh_CN#rd) ### 吃透MySQL系列 * [一招教你防止删库跑路!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496625&idx=1&sn=244a98d5c672ffffe85bb17059f200aa&chksm=cf5644b0f821cda655977b9ad0ceb1cbd4296e2cd88c1fdd8d95e0b49819c0bb95fb1bd05ee9&token=627091476&lang=zh_CN#rd) * [基于MVCC,我用C++自己手撸了个MySQL!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247500275&idx=1&sn=5cc8c5b914554fc5b635675a3013817c&chksm=cf5676f2f821ffe4bebd11f6645d48226274dc282860a3f4779aa5dbcff87524ae5fafa7eaa4&token=2138570566&lang=zh_CN#rd) * [糟了,生产环境MySQL主从数据库不一致怎么办?!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496586&idx=2&sn=358d974f6a660d40263cd0d969f40570&chksm=cf56448bf821cd9dbbe948d795f634c0e5e2bc0fea1466d86d45d07b3afc5ffe4895d7f61afc&token=627091476&lang=zh_CN#rd) * [实际案例:MySQL主键性能压测!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496514&idx=1&sn=5af221bf9fb726e9efb9e436cf3092fa&chksm=cf564443f821cd55b8788dccfb3ff5d9d16d3d36de86c572f7b3b933454f7723ba86d16beaa3&token=627091476&lang=zh_CN#rd) * [MySQL8索引篇:性能提升了100%!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495115&idx=1&sn=d95725917c7ee850664e54ef0f3c4f66&chksm=cf565acaf821d3dc7ab3527b0ffe89e65e239b0d81a682de308a28b4d6789340cd243028ce2a&token=1081687632&lang=zh_CN#rd) * [卧槽,安装完MySQL竟然提示数据表不存在!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492973&idx=1&sn=c397b6e5888bca04466843fdc2791b70&chksm=cf56526cf821db7a5b96ea223fe095fb44997d6f3f1c7e42a6d1396f7bcb62e4f2a39abd85b8&token=1352513703&lang=zh_CN#rd) * [冰河视频聊架构:三分钟带你彻底掌握MySQL Undo Log和MVCC机制!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492818&idx=1&sn=f25ff876866929a84a2afc2464420fd1&chksm=cf5653d3f821dac563efbd2c97ed38b25872a7ce22056e31a023ff80ae8f4e64de08aeb84eb2&token=1574716246&lang=zh_CN#rd) * [一文搞懂select语句在MySQL中的执行流程!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492253&idx=1&sn=6b943ad940c8db578fc0259b09c3d853&chksm=cf56559cf821dc8a4cd8bf87db9c5f615183e2f6ca4e7e20e043403880d89a0d2ceff7e575cf&token=154776380&lang=zh_CN#rd) * [小伙伴们说要看MySQL体系架构的视频,我连夜整理出来了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492240&idx=1&sn=98995b9dc5b749ad285139d6e45c3a4b&chksm=cf565591f821dc87d4143d70bc1a28c0bd503322de0288387c4725c93229210a895a7fbdd7e9&token=154776380&lang=zh_CN#rd) * 
[MySQL优化思路](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490929&idx=1&sn=e8d98d0c9c357f32719af064b230cfa2&chksm=cf55aa70f8222366eea0a124adc70c5c2092c6e528414297f4b8e17deb6f3ebde56603154b41&token=776659970&lang=zh_CN#rd) * [业务高峰期不小心拔掉了服务器电源!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491043&idx=1&sn=03edc8bad89bb5f6176f1b1fa6011f2e&chksm=cf55aae2f82223f4e2fe53a0a1c1e1a8a49d0cd069f53b1c7c9b00553b5f51b1e541426e79ef&token=776659970&lang=zh_CN#rd) * [一文搞懂MySQL体系架构!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491060&idx=1&sn=2fd0339f2b915a6a1eb63938883b3c18&chksm=cf55aaf5f82223e35f04c043dabd7204c3593a70e4be209d9b8bb9bb0e05c2a13b5949fd6a87&token=776659970&lang=zh_CN#rd) * [关于MySQL日志,我与阿里P9都聊了些啥?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491119&idx=1&sn=321d6b7adaa98b2039111b01fe3515c0&chksm=cf55a92ef82220381474e43cdfe5cf8ec070a13c2279475c9f1c0b4e9faae7fc841a24214257&token=776659970&lang=zh_CN#rd) * [记一次MySQL崩溃修复案例,再也不用删库跑路了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489326&idx=1&sn=dcef8d1cb8f73b1610ff283a2f9277d4&chksm=cf55a02ff82229396c0687786aa9f8fc0e955cee2def9b0e0c50404f307bf5a6c052061db0dc&token=776659970&lang=zh_CN#rd) * [如何最大程度防止人为误操作MySQL数据库?这次我懂了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489238&idx=1&sn=ba8ae7148dbdaf055c3ab6a89dc416e9&chksm=cf55a1d7f82228c173cdd3b13612f713119467d6b0bf74f093b37132c85fca128a6c5deed16e&token=776659970&lang=zh_CN#rd) * [记一次线上重大事故:二狗子竟然把线上数据库删了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489225&idx=1&sn=f45605e2e24cbaf0ff35eda79f0620c9&chksm=cf55a1c8f82228deb4abba94e261e6ff04dcc014f66c2fd8e73e66213ff38ee62965abacb644&token=776659970&lang=zh_CN#rd) * [面试官问我:MySQL如何实现无数据插入,有数据更新?我是这样回答的!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489217&idx=1&sn=d4242cf147e077da15eb53ca0a63980b&chksm=cf55a1c0f82228d6b0ba820757acbf0156dc1b370d5ebb333ed1874448fd2253a0b84c9ef8c4&token=776659970&lang=zh_CN#rd) * [我这样分析MySQL中的事务,面试官对我刮目相看!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489207&idx=1&sn=dea22de9b4810b31e1836ec7f6153ccc&chksm=cf55a1b6f82228a0d952f1cb1500cf04949670beced0c05a54b0ca284ae8590de2577fd6fa99&token=776659970&lang=zh_CN#rd) * [如何添加新数据库到MySQL主从复制环境?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489183&idx=1&sn=a966df8e4a942154b677c7cd23cb16a4&chksm=cf55a19ef8222888ede8e67182c95960d0ae5e43c12b659424623a62633108801c7ae9958912&token=776659970&lang=zh_CN#rd) * [如何查询和删除MySQL中重复的记录?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489175&idx=1&sn=72a43c2f94dac0f482d811cb2e60e56c&chksm=cf55a196f8222880c8a115bd891552aa7dd6907051534af601850c9348d57e5df48b5f3694ef&token=776659970&lang=zh_CN#rd) * [怒肝两个月MySQL源码,我总结出这篇2W字的MySQL协议详解(超硬核干货)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489153&idx=1&sn=6195de2d42d61499edc1953d595f07a3&chksm=cf55a180f822289613357b559b46788a3985a070f20a442ecedaaf34ff7b9c94b44aef9d578e&token=776659970&lang=zh_CN#rd) * [如何以最高的效率从MySQL中随机查询一条记录?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489133&idx=1&sn=434ca533aee95e2a0e4307f4f5f48345&chksm=cf55a16cf822287af928ff88aed8826a5e3bad5fb51744a539e29733f867ad3abb48cbcd6948&token=776659970&lang=zh_CN#rd) * [从零到千万用户,我是如何一步步优化MySQL数据库的?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489128&idx=1&sn=bfd3e00d8c8f8e4f259bf5e6b9e49f16&chksm=cf55a169f822287ffc79194e3b8103fb04ec7391d2a89ab3fb9d7c046243d2a04f414917253e&token=776659970&lang=zh_CN#rd) * 
[MySQL主从复制的原理和注意事项](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489124&idx=1&sn=99436502e91fc1b0ae1abc75676a9f38&chksm=cf55a165f822287311a5751dbfae0df37614a412008ae3c4072924ef16d2c91ec9f6239ec588&token=776659970&lang=zh_CN#rd) * [如何最大程度防止人为误操作MySQL数据库?这次我懂了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489123&idx=1&sn=0fefb3c14f0ca8dc29f8614fafea5086&chksm=cf55a162f822287477ea95fd59f655f04227bfb61c3e6bf61419f2a93ed75fced8cf2f6ab44e&token=776659970&lang=zh_CN#rd) * [糟了,生产环境MySQL主从数据库不一致怎么办?!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489119&idx=1&sn=680c76b105c1b482dd9ecf335a25ec21&chksm=cf55a15ef82228485d75c580ba722029b2ec68f403314ae7ba5ff83cedc2b4ed7d5cad3c5c04&token=776659970&lang=zh_CN#rd) * [MySQL如何实现查询数据并根据条件更新到另一张表?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489118&idx=1&sn=7794f3c26c4404da9c389fa080837146&chksm=cf55a15ff82228497a7de6f3969678003971bdbc8fcf0af6f22e39893b2ab097e975f377289c&token=776659970&lang=zh_CN#rd) * [怒肝四个月MySQL源码,我总结出这篇MySQL协议(详尽版,文末有福利)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489117&idx=1&sn=7699ded5829ca2a9bcd73ab8ab77fcfa&chksm=cf55a15cf822284af99b74917eef7dec4ea807c76b3c6a283f318c7b7a64fa2c81c55ddf57a6&token=776659970&lang=zh_CN#rd) * [结合实战,我为MySQL主从复制总结了几张图!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489108&idx=1&sn=f0f74fbe01c0089ee854f95389786ce9&chksm=cf55a155f8222843504d2222944cda367f1dd20a92ae22ce4bbe01442d450e5d8c96287f11f1&token=776659970&lang=zh_CN#rd) * [源码编译MySQL8.x+升级gcc+升级cmake(亲测完整版)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489464&idx=1&sn=42dc9cb83bfc7db2c9a4953ecd97a5d0&chksm=cf55a0b9f82229af08ac3c20da71c6123033244611c9a9324cd8e86d14a321d1bb8fba0c1ad0&token=776659970&lang=zh_CN#rd) * [你知道MySQL 8.0中的索引有哪些新特性吗?看这一篇就够了!!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489457&idx=1&sn=e36a83596c5908ab5daae53b38f3b21b&chksm=cf55a0b0f82229a687404abf15cf84fd435c7d1e51477bfe9fc9d7212803d8eece18644b2215&token=776659970&lang=zh_CN#rd) * [记一次MySQL崩溃修复案例,再也不用删库跑路了](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489455&idx=1&sn=1287ebe3295056b4fcb5cc780f040a80&chksm=cf55a0aef82229b8680c171b980218a6ab7b17d479477e4407645e7c846535f90bc0e8512a69&token=776659970&lang=zh_CN#rd) ### Mycat系列 * [作为Mycat核心开发者,怎能不来一波Mycat系列文章?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489127&idx=1&sn=4588699167a43f2965ac15c560acbe9c&chksm=cf55a166f82228707b06b01b686979003d3038b38a111c94b185e110253aedb9bc13939dcfee&token=776659970&lang=zh_CN#rd) * [Mycat核心开发者带你看尽Mycat三大核心配置文件!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489126&idx=1&sn=abe86c8bdb0bdccbe32ca8998ccd1527&chksm=cf55a167f822287118ad92e96b32513e615aac27a467a6a8c59a7c0c1ddfcf11a801f323521b&token=776659970&lang=zh_CN#rd) * [Mycat核心开发者带你轻松掌握Mycat路由转发!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489125&idx=1&sn=2020aa02b49ea0a9dd59f9510a600e54&chksm=cf55a164f8222872b62e790224467df0c8b9df48abacbecb7fc2dce4e3c85e3cb803a8358a84&token=776659970&lang=zh_CN#rd) * [MySQL如何实现万亿级数据存储?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489111&idx=1&sn=41ceba746b58dd0d906f376928ceaa0b&chksm=cf55a156f8222840ac0c305e66cd3929a83216b044ca495169d38d39ab568590b38a6d076218&token=776659970&lang=zh_CN#rd) * [冰河,能讲讲Mycat如何实现MySQL的读写分离吗?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489098&idx=1&sn=b3bf7027960306aa8054efbb235257a6&chksm=cf55a14bf822285da48ae70121f8b604365d7791a60930cb35adf5dd7649c8ad550abb2bd334&token=776659970&lang=zh_CN#rd) * 
[海量数据架构下如何保证Mycat的高可用?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489096&idx=1&sn=1be285069a2153f494a81dc5f9aef708&chksm=cf55a149f822285f487b47a0bc5a0b421a14cf2dfd4ebdd8ae817167b5ffabe4f45676f65dca&token=776659970&lang=zh_CN#rd) * [冰河,能不能讲讲如何实现MySQL数据存储的无限扩容?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489525&idx=1&sn=3a2d41612b208f67ead0c114fc6f7ca8&chksm=cf55a0f4f82229e29a03397d11a4cb366f1d6870d0b94d701ecd12ba5a89bedbd11a06402795&token=776659970&lang=zh_CN#rd) ### Oracle专题 * [什么?作为DBA,你竟然不会安装Oracle??](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489245&idx=1&sn=58f4e9058aebca52cc4a99bbf55eaa5b&chksm=cf55a1dcf82228ca9e426472e96cbf176d67654ed1f79b740d9a619b03fc8a40226f206455e7&token=776659970&lang=zh_CN#rd) * [你的Oracle没有开机自启?那是你没有这样操作!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489244&idx=1&sn=7eaf1ded3a965a05f1142d2ff8974790&chksm=cf55a1ddf82228cbdc8940933d838919bc1b7dbcffd4ce81fdf0660a8515e9549f8946276e58&token=776659970&lang=zh_CN#rd) * [什么?你在物理机上安装Oracle总失败?那就试试Docker吧!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489243&idx=1&sn=e801c592f04cb63ab28872e5672af944&chksm=cf55a1daf82228cc14bd7611655c2bec32415502d64d2db5e82ccb021e027d952dc0b54c9825&token=776659970&lang=zh_CN#rd) * [还想着花钱实现Oracle数据同步迁移?一招教你搞定!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489242&idx=1&sn=6d3b26339aea13ef592540064c2ef745&chksm=cf55a1dbf82228cdf4c3e154df9a880f158132245ae5ba15cfab5738319f637c2e9dcdd006bd&token=776659970&lang=zh_CN#rd) * [Oracle如何查看所有表和字段以及表注释和字段注释?其实很简单!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489241&idx=1&sn=92af761e491a454865c8d651841c6bb0&chksm=cf55a1d8f82228ceb800812fa7f2161e9da56b39c17ff6ed50a9ccbd00b5385e0c95cdf950d1&token=776659970&lang=zh_CN#rd) * [手动安装和卸载Oracle,这是最全的一篇了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489237&idx=1&sn=6bf9ac0d22ccb317ede2d89a356e8334&chksm=cf55a1d4f82228c2c6e13f74e46644bbed3161484c9b5b52575844f9db7685ddfd874b763f6a&token=776659970&lang=zh_CN#rd) ### 程序员进阶系列 * [分库分表有哪些思路和技巧?(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497603&idx=1&sn=915322da262aaa7d2c8626038e748268&chksm=cf564082f821c99406b0b9cc1a7e6ab4d940d9c5abba96c2053030abce138046071d8dc57cbf&token=1170633251&lang=zh_CN#rd) * [周末给大家推荐一款无需注解的 SpringBoot API文档生成神器!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247500966&idx=1&sn=60e2d84f7546b1553975c5b0f5c22e4e&chksm=cf5673a7f821fab11bfe6e6d243c4d7f69786835b6e409fd84b598740d7ceb3844cc25fdcbc3&token=2138570566&lang=zh_CN#rd) * [操作系统基础知识总结(重要)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499626&idx=1&sn=d6475af51bf2d8551d1ded75828b9cf4&chksm=cf56486bf821c17dd1b34456c56c0d61e0a8998d0b85fd9e19fa3d4c6a991caffa0bbffc957f&token=2138570566&lang=zh_CN#rd) * [肝了一个月总结的Linux命令(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499900&idx=1&sn=81bb99956c766ab4bf72cbf1f9d61156&chksm=cf56777df821fe6b35e680e942b1d6e24b0bc2f580d77c2dd52f00195f172aaa51c49445fba0&token=2138570566&lang=zh_CN#rd) * [冰河为啥有很多时间写文章,录视频?关键在于事半功倍!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499538&idx=1&sn=22791b4ddd223aebf860490bf03dbdf9&chksm=cf564813f821c10539106c1b2e7010285ad99921732e051f06722e836ec9c55500b097a4cfd1&token=2138570566&lang=zh_CN#rd) * [春节假期跟冰河学这些内容(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499374&idx=1&sn=f29219ea4e451fcfc1c43e60a3361aea&chksm=cf56496ff821c0791448d356d21fe8e40751f46522e241d52bcc2f7a8fd9a0b05385d8357c39&token=2138570566&lang=zh_CN#rd) * 
[一段很有意思的代码!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497594&idx=1&sn=6d3a0de703dc32e1593eee5b9bc6b297&chksm=cf56407bf821c96df496de14273c819a33010e4a50b98c3cec9193f6fb435d2a4eabd6fe3e6b&token=1170633251&lang=zh_CN#rd) * [如何提高编码效率?yyds!!(建议收藏)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497004&idx=1&sn=fd73245e65b0c83aeb6bac39f9313248&chksm=cf56422df821cb3beeb185c1816374c7bdfc337bd439e6a7ae2a812e677688ec3e3c4b6d9a46&token=1170633251&lang=zh_CN#rd) * [冰河常逛的23大学习网站(建议收藏)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497003&idx=1&sn=87fd7485c213e250362019fc14075d9d&chksm=cf56422af821cb3cb7318048bfbfadc820ddfda9844dc8a20472c262b901a237f092eb89e2e3&token=1170633251&lang=zh_CN#rd) * [8万字总结的Java8新特性!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496648&idx=1&sn=06e0edf146e9e81092a4f894b20f27b0&chksm=cf5644c9f821cddf208a172144dec6c866928bfdbb6967c6c51464b9f0fe964b281c7833346b&token=627091476&lang=zh_CN#rd) * [这两年,我摊牌了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496495&idx=1&sn=359ea5749af291e33bb58460a6295e97&chksm=cf56442ef821cd38e056a888a42271e042fd8932a0b557aee3d02da1c9c60951d2c59e934284&token=627091476&lang=zh_CN#rd) * [两年时间我出版了三本技术图书,发布了10本电子书,写了400多篇文章!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496437&idx=1&sn=780293c19e88e15660d94644fe27b159&chksm=cf5645f4f821cce24bccd7200791c333a2914f1394340480d63f07526f07866a3bc4018f2ba9&token=627091476&lang=zh_CN#rd) * [项目配置不当引发了数据泄露,人已裂开!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495520&idx=1&sn=d5793dd1f185748917584fac2235967d&chksm=cf565861f821d17791d58e2536239dcf18e3ed4d0d2116e576d2447d76c43e2fd5cc0c97f755&token=546222828&lang=zh_CN#rd) * [程序员废掉的十大铁律,你中了几个?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495133&idx=1&sn=c69e0fe0a342e6197f5957f5f4a53e51&chksm=cf565adcf821d3ca29d09531b71cefaf2531bba47a3e9d9eed4361d829d098cb1a10ebf94543&token=1081687632&lang=zh_CN#rd) * [被SpringBoot整合OpenFeign坑惨了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494602&idx=1&sn=1de5a61612a2679a893d1c52b729a1a7&chksm=cf565ccbf821d5dd1d14915d29100bcb9025b7fd8bbaa81d1c548a219d9af16947dc44c48606&token=1772864567&lang=zh_CN#rd) * [撸完Dubbo源码,我总结了这些阅读源码的技巧!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494573&idx=1&sn=67e0016827b3a339c8cec216b8c4f7f7&chksm=cf565cacf821d5ba142a6f557203ac37cdae979ce1df14be78a42128ca9b268ff10cb5e0298b&token=1772864567&lang=zh_CN#rd) * [写了个工具,完美破解了MySQL!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494142&idx=1&sn=5692a8facd99a0907358073a96381c3c&chksm=cf565efff821d7e9f8dcb276e7b75a2589a90e64bbe3345f8261e727db3a4b5903a48656509b&token=1772864567&lang=zh_CN#rd) * [写了个爬虫!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494101&idx=1&sn=e0649b642bd43a91610907aa0cb9a524&chksm=cf565ed4f821d7c2a2b1811f2787525ee6e1ae0a90edd705ab4ca5d306064e814d9746368f6a&token=1772864567&lang=zh_CN#rd) * [发现一个超硬核学习宝藏!爱了!爱了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494087&idx=1&sn=3b3eee339ed21367476cd87bcf817467&chksm=cf565ec6f821d7d06824631c00f788205a685789065202800080eced0dea3d80b2492e559e38&token=1772864567&lang=zh_CN#rd) * [10分钟搞懂各种内存溢出案例!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493602&idx=1&sn=abd5849009a189c31a6634e28bd4011f&chksm=cf5650e3f821d9f5cef8c4e743615e00fd969fadd303e5c6895b2963a1ca623fd38b37ea0fcc&token=1929495520&lang=zh_CN#rd) * 
[半小时实现Java手撸Http协议,爽!(含完整源码)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493570&idx=1&sn=6f2d69a8d25cf08f972085782eb4391a&chksm=cf5650c3f821d9d5a548350be94cd8b3236548fcf47be338f28d431a0f77efcf1a51aadb0729&token=1929495520&lang=zh_CN#rd) * [如何做到每天比别人少写200行代码?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493363&idx=1&sn=f7106317aa3cd53d0f87792e618f3239&chksm=cf5651f2f821d8e4d413c4897faa7241e635a22cdbd627657c4dd49346580a9687ca90a20f26&token=447597716&lang=zh_CN#rd) * [三天撸了个中国象棋!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493007&idx=1&sn=09fd7170d10c33970838e9e82fd0dba6&chksm=cf56528ef821db988e4796f16ccf02de3a905f0d124b1c3bfa2f92d081eb3071c3c863c2427b&token=1352513703&lang=zh_CN#rd) * [如何将电脑上的“小电影”隐藏为一张图片?这波操作绝了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492847&idx=1&sn=98b59658ed4e8240c3417fb4f9034b74&chksm=cf5653eef821daf864720d926ca846f6efe7d96ee92054254db00b88a74cd1310d92748fbb05&token=1574716246&lang=zh_CN#rd) * [2.5万字54张图爆肝计算机与操作系统基础!!(建议收藏](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492810&idx=1&sn=05adfc362125de1daedce9e20e476256&chksm=cf5653cbf821dadde9093e45b96c43291be0ee5e83e85525f1922c7f5c72e0e4b7528d8fa6dc&token=1574716246&lang=zh_CN#rd) * [Java中一个逐渐被遗忘的强大功能,强到你难以置信!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492241&idx=1&sn=e9473f721cb79e27196aac7b978f13b7&chksm=cf565590f821dc869c8fb7fe2d13a04a4c797e3874b09fa26120343e8f057bdd098d45974ac9&token=154776380&lang=zh_CN#rd) * [图解计算机中数据的表示形式](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490388&idx=1&sn=1805a62814452c598c567e65de274c67&chksm=cf55ac55f8222543d1506fb48dcc22c3d5be0a458e3b11a224fd98248e11f86d17468204fcfc&token=776659970&lang=zh_CN#rd) * [图解计算机中的数值范围和浮点数运算](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490432&idx=1&sn=12b644c45d8b5bbc1379426a7c9eeee9&chksm=cf55ac81f8222597307784843d90dfe2f1f2c6b6fd09328070509bd79aaa3fa36b85b42131f2&token=776659970&lang=zh_CN#rd) * [图解计算机结构与体系分类!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490467&idx=1&sn=6a3d54679657583b64f9167d9fdabbe4&chksm=cf55aca2f82225b4f88fe239cc8a80bb9d802befa5d5be74235970379af9891da246aa000c76&token=776659970&lang=zh_CN#rd) * [Http Header中到底有些啥?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490533&idx=1&sn=11bb993617094185a034c007f7fefa8f&chksm=cf55ace4f82225f2728de2d223da886fa1a9b4d39d243424ad26f99ae69f80fecf80eefa1707&token=776659970&lang=zh_CN#rd) * [计算机中的流水线技术到底是个啥?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490553&idx=1&sn=8f62d853786e57af9045ad98d1eefe16&chksm=cf55acf8f82225ee47f2e4dbec44142ca3d318034b6be07c364a8e0a9bb54fd1a78796621a3f&token=776659970&lang=zh_CN#rd) * [如何在服务器上进行双网卡双IP双网关配置?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490573&idx=1&sn=e9c39a58032ed20701079b4d6a1f02a6&chksm=cf55ab0cf822221a0ed0d395def3943079cb75b9b843a542120efc023afb33c7ae3d9d1df95c&token=776659970&lang=zh_CN#rd) * [计算机中的层次化存储究竟是个什么鬼?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490594&idx=1&sn=187341456d21256536b63e7caac8538a&chksm=cf55ab23f8222235078eb54e7356d981da5f64f4202668f46ccff9ee8938070541c99873259f&token=776659970&lang=zh_CN#rd) * [程序员心中的一道坎:主存的编址与计算和串并联系统!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490610&idx=1&sn=62ab9e5f3916d52ee4e6c1ddb56eb7a4&chksm=cf55ab33f82222254990d42b81ecd3c03d5e1636926f3347c409ab983e4dd6a95d8a2381b4dc&token=776659970&lang=zh_CN#rd) * 
[冰河公开了其总结的一项重要的编程技能!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490847&idx=1&sn=5647a499f0188e1355ee4d722825e2e1&chksm=cf55aa1ef822230830bdbafcdfa23ae9536dfd0999949ffd582be8300baeb5c83b20cb0d8800&token=776659970&lang=zh_CN#rd) * [项目中数据库密码没有加密导致了数据泄露!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490951&idx=1&sn=c5a62f4355ca8c5dbced5f8da2cdae1f&chksm=cf55aa86f8222390ed02e43ce3f4330e712916eeccb37a051d4e5c42d6c0f2f217bfe0ae2ef5&token=776659970&lang=zh_CN#rd) ### Java8新特性 * [肝了一个月7.3万字的Java8新特性,我不信你能看完!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491839&idx=1&sn=6fd08962ec03b6d91638849d181eefc9&chksm=cf5657fef821dee8cddc5737620ab1949a8b66cbc3ad05c01355c93614282b4eea74309e11ba&token=1093611389&lang=zh_CN#rd) * [冰河带你看尽Java8新特性,你想要的都在这儿了!!(文本有福利)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489275&idx=1&sn=a5414c2e3224e262d40032646032414a&chksm=cf55a1faf82228ec6a73d6b7ea7b0dc14c839690353c305d1df2533d45bd48d1a98140544e21&token=776659970&lang=zh_CN#rd) * [一张图带你领略Java8有哪些新特性](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489333&idx=1&sn=b96556e9c99d458c2936d4f0ca10ea5f&chksm=cf55a034f8222922fc23579a19c770390461a310252a673e86b6bfa54b49fa2a2dfc5714894b&token=776659970&lang=zh_CN#rd) * [Java8为什么要引入Lambda表达式?原来如此!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489332&idx=1&sn=4d7e1e65fd44b3b8fc97df17bad58866&chksm=cf55a035f82229230d35c50b3c945bc57efd684baf318ea05ba57fb3e97ef00e968cca4d1907&token=776659970&lang=zh_CN#rd) * [Lambda表达式基础语法,都在这儿了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489331&idx=1&sn=4fbe2a287599ef613501008109fe0aa5&chksm=cf55a032f82229249e2db0c66018097653f108cdae1827d07df43cab408fdd6482e692c66902&token=776659970&lang=zh_CN#rd) * [Lambda表达式典型案例,你想要的的都在这儿了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489330&idx=1&sn=0738e60a13660080228da9f1be674f99&chksm=cf55a033f8222925de52335eec3767b226295541b13b530df36eb1d86fc99253798bff274913&token=776659970&lang=zh_CN#rd) * [还没搞懂函数式接口?赶快过来看看吧!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489329&idx=1&sn=73d3cd645fdecc68793efd9e11c89590&chksm=cf55a030f8222926acd31580950b8b41bca3e33319d96628cbb1571190ad619052e0e7b51a24&token=776659970&lang=zh_CN#rd) * [知识点总结,你都会了吗?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489328&idx=1&sn=f93ff6cfeb1bc444707f5b8bb0b07640&chksm=cf55a031f82229278e77d1158f203ec942197f2de70b523a76472aa89d70b67aabd373223e71&token=776659970&lang=zh_CN#rd) * [浅谈方法引用和构造器引用](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489323&idx=1&sn=c95d1b281b1debf30718bb9b39d95c65&chksm=cf55a02af822293cdbbb03d837b7532048fa2e597814f23125633e23b483babb318fe7b42a24&token=776659970&lang=zh_CN#rd) * [关于Java8的Stream API,看这一篇就够了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489322&idx=1&sn=df97fb8de194ee92c509bb84777a5346&chksm=cf55a02bf822293d18a8d5fc50561ea5219a0d7099b999571aba800f9f2e962c2f8d4cc66df1&token=776659970&lang=zh_CN#rd) * [Java8中创建Stream流的方式](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489319&idx=1&sn=a4300069c6d880ee8963baf3ed26e2ca&chksm=cf55a026f822293087d6c6236052249520c936fd5d8fde3bbb5adc4c8ccc4334949179759361&token=776659970&lang=zh_CN#rd) * [Stream API有哪些中间操作?看完你也可以吊打面试官!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489317&idx=1&sn=462155310db1d09d9425208062c67643&chksm=cf55a024f8222932c921670c44fde7254620897bfd66d6d278d6d6881382988b732bc361ab64&token=776659970&lang=zh_CN#rd) * [谈谈Java8中的Stream 
API有哪些终止操作?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489316&idx=1&sn=265401ed9f736f0cdf5ce9cf75e620a8&chksm=cf55a025f82229330323603ffa1f365bc728ffe0759e428d7300fc43faee0b96788d5a15e117&token=776659970&lang=zh_CN#rd) * [关于并行流与串行流,你必须掌握这些!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489315&idx=1&sn=2446ebe443df89a2f2ad10cab0901921&chksm=cf55a022f822293483c56d896646e79125b27bbce12f5ff7f58844a6a3d102e19deea63d9d17&token=776659970&lang=zh_CN#rd) * [不了解Optional类,简历上别说你懂Java8!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489314&idx=1&sn=b6140423cacae11819863d3f9cd0e2d3&chksm=cf55a023f8222935cdf66cee0bf0f23a3b4ccf2e360997133c7f07f8c5e28788413807646b24&token=776659970&lang=zh_CN#rd) * [接口中的默认方法和静态方法,你都掌握了吗?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489313&idx=1&sn=87d9e82c51261aa3e46b96c803b95661&chksm=cf55a020f8222936400802323f9eb18b9ed0c41dffb1ccb3e37afda2b97351d4a421eb369cd2&token=776659970&lang=zh_CN#rd) * [关于Java8中的日期时间API,你需要掌握这些!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489312&idx=1&sn=5c708966e3184c0b615b8ca8346297eb&chksm=cf55a021f8222937b3cc63332902bbcc8bb34057c757b4e02a6ef9fdcc1a336fc3af858c8657&token=776659970&lang=zh_CN#rd) * [不只是Java8的注解,你想要的都在这儿了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489311&idx=1&sn=d1b4d99b5d51879b7b1131d97f8474d1&chksm=cf55a01ef82229086a1913c10af0c1bfff2b18ab54931da92d87845161372febfb16cbf7c352&token=776659970&lang=zh_CN#rd) ### 云原生专题 * [五一假期服务器“裂开”了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493038&idx=1&sn=c69480fdb44a638109fb59c78e9499db&chksm=cf5652aff821dbb9bbd332ba95d44fdbada2d766b1ce5b0b58a0fae556ceeaa7007028f5be9d&token=1352513703&lang=zh_CN#rd) * [这次不鸽了,五一假期冰河为你整理了这些Dockerfile基础知识,拿走不谢!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492988&idx=1&sn=c6576dda9fc329b374093b18215cdbd9&chksm=cf56527df821db6bd9a4042f32c81380d8d465edd5feffcdb2b3fc4a63ac40d1ab389f013daa&token=1352513703&lang=zh_CN#rd) * [三万字无坑搭建基于Docker+K8S+GitLab/SVN+Jenkins+Harbor持续集成交付环境!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489107&idx=1&sn=60dbfea150102e88e1a0b465cbb0b448&chksm=cf55a152f8222844caaa1678f96576772ed51387d54345cfb29210333f1b6885ee48e8f28b09&token=776659970&lang=zh_CN#rd) * [冰河教你一次性成功安装K8S集群(基于一主两从模式)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489106&idx=1&sn=2b638241ab73714491707b09db69d3e1&chksm=cf55a153f82228453d9b0f472be5aadc47ea92ea0680961c6a8810b83ce4dc0eb657b8e49f98&token=776659970&lang=zh_CN#rd) * [卧槽,运行Docker容器时磁盘空间满了怎么办?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489103&idx=1&sn=756794976822639ff27e5591c8625177&chksm=cf55a14ef822285826891b5d86ec49ceb24511f8fd238d16e0a31df787f0a971126100391223&token=776659970&lang=zh_CN#rd) ### 开源框架与中间件系列 * [SpringBoot + Sharding JDBC,一文搞定分库分表、读写分离](https://mp.weixin.qq.com/s/11dQHtDwb51ZKheD0nkmaQ) * [周末撸了个Excel框架,现已开源,yyds!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496134&idx=1&sn=b1eff30e4b0547586251d588c34245b2&chksm=cf5646c7f821cfd1a3723cc2a40f038b1719c676ba855cc588c4ccf0e59c0d1c90f400078e8c&token=1837443586&lang=zh_CN#rd) * [10分钟手撸极简版ORM框架!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499550&idx=1&sn=cf4dd9a8d5901cdf01a213994474ce1c&chksm=cf56481ff821c109a3e48b92ab5b258a4b5867e939c0a800167f027e51c38ebc11ce01eb0df9&token=2138570566&lang=zh_CN#rd) * 
[报告:冰河写了个开源框架!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495476&idx=1&sn=a5ed410dd3c13691000d3e11086eb370&chksm=cf565835f821d123c7172408ec88dc0b66354c163419c87300fa304b5be9cec111c100d6c4df&token=1081687632&lang=zh_CN#rd) * [万字长文教你消息中间件到底该如何选型?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490146&idx=1&sn=b01d1bc7281b5bfdb93bddca2e01553e&chksm=cf55ad63f8222475086b63080ea945805da036102bb199bd9ef4bdff4facaf7fc7041d337015&token=776659970&lang=zh_CN#rd) * [两行代码修复了解析MySQL8.x binlog错位的问题!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490278&idx=1&sn=1681ff3ce2d7ccc133d5a4436fb21359&chksm=cf55ade7f82224f145462689ed0b0e7d0e86e162c1707a954b30fcb293f997d60568f2466534&token=776659970&lang=zh_CN#rd) * [撸完Spring源码,我开源了这个分布式缓存框架!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495719&idx=1&sn=077cc245c11605523210cc57057d356c&chksm=cf564726f821ce30b5aa4b9b6a1aa92c1f1ea9b2c46fef84bf59e2cce9065bcaece568c7a10d&token=546222828&lang=zh_CN#rd) * [万字长文解密数据异构最佳实践(含完整代码实现)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491249&idx=1&sn=f1e24d969b3c1b3f312241c423908184&chksm=cf55a9b0f82220a61f664b21b8c3709821e13db7e9a2bf32a45f7db72005b829e41c44ef8601&token=776659970&lang=zh_CN#rd) * [互联网背景下有哪些数据同步需求和解决方案?看完我知道了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489246&idx=1&sn=ae8ad62bed2f01065696481d183a7cac&chksm=cf55a1dff82228c9311046f61adad90346556bbbd68fced784ec163090a04809048ec7426d9a&token=776659970&lang=zh_CN#rd) * [数据同步的终极解决方案,阿里巴巴开源的Canal框架当之无愧!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489240&idx=1&sn=f3f13ab7366494ece7961f1a0aa7a8dd&chksm=cf55a1d9f82228cf831cfbc2d7df4be5188a0a36ee3a4fa6a4bc1a893fd4e86bd43b961f66b2&token=776659970&lang=zh_CN#rd) * [面试官问我:看过sharding-jdbc的源码吗?我吧啦吧啦说了一通!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489197&idx=1&sn=b53d0ca82519b368b55e0f36271fb75a&chksm=cf55a1acf82228bacf15f1b40da7ce3212ebbd394befe822c26bba19ff3ab68d07f5945ae3e5&token=776659970&lang=zh_CN#rd) * [我把这个贼好用的Excel导出工具开源了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489165&idx=1&sn=25ee699c884bda3c33ba1b4f414db31b&chksm=cf55a18cf822289a0a3b5d1747c1fd99529c4edd3f82b32f39e47afa48e027250f507949991a&token=776659970&lang=zh_CN#rd) * [贼好用,冰河开源了这款精准定时任务和延时队列框架!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489113&idx=1&sn=64767b64063c4229b64ab53769e9ab6b&chksm=cf55a158f822284ea53f931766dfa1a8a77aea95faaeef83bb3f71bb13df84ea776f6f630b1f&token=776659970&lang=zh_CN#rd) * [冰河开源了全网首个完全开源的分布式全局有序序列号(分布式ID)框架!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489099&idx=1&sn=6f5c110d4d6a4dc70abfb69f6a281dd2&chksm=cf55a14af822285cd13dd9d5c711e10c24e3e85c96eab05c0c04ce83207fac52446eee3d2004&token=776659970&lang=zh_CN#rd) * [贼好用,冰河再次升级了这款开源的精准定时任务和延时队列框架!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489571&idx=1&sn=addba2666568ade7f260852f4da7463b&chksm=cf55af22f8222634df9518c397a6fdf0af6bd4a80c84c58e220792d4bee5a1c07f53e2ec833f&token=776659970&lang=zh_CN#rd) * [基于java开发的功能强大、配置灵活的数据库之间的同步工具](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489469&idx=1&sn=bd879a614feb31ba6bccabcce54d39c1&chksm=cf55a0bcf82229aa674dc87cc377642f931a1a80bda469ee800a7a8386d6eac2cafbfcd224dd&token=776659970&lang=zh_CN#rd) * [RocketMQ之——单机环境搭建](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489465&idx=1&sn=2a9d4a180d4fde60de167843f2544469&chksm=cf55a0b8f82229ae1b8931479ce3179c9d61906ce9f1d48de32e51284426877d67dd0f1547da&token=776659970&lang=zh_CN#rd) * 
[你知道如何实现分布式锁吗?这边看:分布式锁服务插件——mykit-lock开源啦!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489459&idx=1&sn=96fefb1d895d7543e4eb59a3c79c983c&chksm=cf55a0b2f82229a424ff630d2c01bc8b73e6932d5f77e194d690d266e0733549721305d1627c&token=776659970&lang=zh_CN#rd) * [mykit-excel插件是我用过的最好用的Excel导出工具,没有之一!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489441&idx=1&sn=89c76952ab4f20eb45eeebf893bb3da1&chksm=cf55a0a0f82229b6a02b98e04641f664c899d0cc7709fd8a1ab07e1f9135f3758a3b0b85c5f5&token=776659970&lang=zh_CN#rd) ### 分布式缓存系列 * [缓存最关心哪些指标?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495996&idx=1&sn=27aa388ed9f841149746e8a2aa8546e5&chksm=cf56463df821cf2bf356191ec8c37085adc5852e2ca056fbd9da729873b3cbafca1169ae677b&token=1837443586&lang=zh_CN#rd) * [超强Redis数据类型与应用场景总结!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493921&idx=1&sn=f1182dab854b9510298a352ed4be1b3e&chksm=cf565e20f821d736248b076d8708ef7ade53056b954ba164abfac8a4cdd617b12629bd5b8a14&token=1929495520&lang=zh_CN#rd) * [到底是先更新数据库还是先更新缓存?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491950&idx=1&sn=91f2fb8e87967c780b91772fb8f39df9&chksm=cf56566ff821df793e162510f97580e970eb84c1a39426febd920246edbdf721b5f35ab339c4&token=1093611389&lang=zh_CN#rd) * [Redis开篇与如何安装单机版Redis,这次我会了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489227&idx=1&sn=ed93bc18fdc643410a36f67111cede82&chksm=cf55a1caf82228dca00cc705f4adf779a71c707742a77d998a85f8fca278a810e4402fdaad98&token=776659970&lang=zh_CN#rd) ### 分布式存储 * [4.5万字手把手教你实现MySQL TB级数据存储!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490090&idx=1&sn=6cfda5e97794e947bb42709b7197e0d3&chksm=cf55ad2bf822243d2a270752ba6b9ace01b24b8aa2b9f79de505e082b97d9a04691ce7e59936&token=776659970&lang=zh_CN#rd) * [记一次缓存服务器迁移史,心塞!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490170&idx=1&sn=6d8ae64d9f76754a2dccc12af8c57d2f&chksm=cf55ad7bf822246dd89ed1804b1ea319c0544297b301eabcb34807aea554d6c8422bd45121a2&token=776659970&lang=zh_CN#rd) * [如何扩容单台服务器的存储容量?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490300&idx=1&sn=ebd4555a8cf6f55369946a0e9a4a858a&chksm=cf55adfdf82224eb448c600284351ef24ad6d572a03099d93828c57847f63f36a987b0ee7c94&token=776659970&lang=zh_CN#rd) * [新年上班第一天生产环境分布式文件系统崩了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490878&idx=1&sn=89a0264f21d46cf36cf090c5f9612ba0&chksm=cf55aa3ff8222329ec181402b6942649b87511b972467fa6892dc61e63b0d9881625f252957c&token=776659970&lang=zh_CN#rd) * [冰河又一超硬核分布式存储PDF教程免费开源!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489177&idx=1&sn=366f288aaadffbe4a78597ede8641390&chksm=cf55a198f822288ea86e122b5538762fe2aeeed909a1d6657522aa4d3f1851a172d44854c16e&token=776659970&lang=zh_CN#rd) * [FastDFS 分布式文件系统的安装与使用,看这一篇就够了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489280&idx=1&sn=2a0994167d73b70e7373735c5faa11a4&chksm=cf55a001f8222917ad5a0c914c9438ebd9e452ed95d7c059028d74adcfaf8259208b007cc206&token=776659970&lang=zh_CN#rd) * [如何打造一款高可用的分布式文件系统?这次我明白了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489279&idx=1&sn=57261571f8f5550572988a36520efb16&chksm=cf55a1fef82228e89144a7054c6fe70a9806ba0c890b532d91b2de14756461c747849a02e139&token=776659970&lang=zh_CN#rd) * [FastDFS 分布式文件系统的安装与使用,看这一篇就够了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489276&idx=1&sn=1ad82d9e39ef4e6634c7254ca17b548d&chksm=cf55a1fdf82228eb434d76ce360a9e2bdfda649c6d63037beaf51572cdca969eee9b8610bb84&token=776659970&lang=zh_CN#rd) * [小伙伴们说在CentOS 
8服务器上搭建FastDFS环境总报错?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489180&idx=1&sn=17e2788c3bb37d3ea1244851ce62b8d4&chksm=cf55a19df822288b651d52ec464007da82c46bbba378b2d1031569a4bf269cdd52938fc2ade6&token=776659970&lang=zh_CN#rd) * [如何实现文件的大规模分布式存储?(全程实战)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489179&idx=1&sn=3d090d957d98724dc5ec90ee71264b63&chksm=cf55a19af822288c20a82c1e27e11a20b9a4899af3cb33ea061ef7cbacdabb999f29e6164a72&token=776659970&lang=zh_CN#rd) * [SpringBoot整合FastDFS实战(附完整工程源码)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489178&idx=1&sn=8a2571ec2145a9eae34bb8122d848f70&chksm=cf55a19bf822288d517fe3114e0c9cd96db96111e23ad6d7aa9b653d7f2387db99cffc448bfd&token=776659970&lang=zh_CN#rd) * [都是Nginx前缀惹的祸,FastDFS不能访问了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489104&idx=1&sn=c5af9ad42cd34aa2c5b47e773034df00&chksm=cf55a151f8222847123f4f598071a152ddfc0a88b7b888650c13833f212ec239aa063ca3717c&token=776659970&lang=zh_CN#rd) * [互联网大厂有哪些分库分表的思路和技巧?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489682&idx=1&sn=1ce0d3631bc1054d53767bc3c47321c9&chksm=cf55af93f8222685205e2d2be2a696cd3be7e8102f0edfb52d8276aa91a1e3818c779eb934c8&token=776659970&lang=zh_CN#rd) ### 微服务专题 * [冰河公开了这份微服务学习笔记!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495884&idx=1&sn=fddfccd536fd2ed61aa013e9cde8b34e&chksm=cf5647cdf821cedba7ce6e8dd93d429e20f80647f2eed4d45fafb5fe8f9d1c7576211c3d91cc&token=546222828&lang=zh_CN#rd) * [系统由单体架构到微服务架构到底是如何演进的?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493667&idx=1&sn=2e0d21381a3f34d312c8a605e9144f4e&chksm=cf565f22f821d634243681ce40d19d396452b0156ebfe99c812c2d363d885511e3e9343d4e7f&token=1929495520&lang=zh_CN#rd) * [一个很多人不知道的SpringBoot小技能!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491991&idx=1&sn=d4447165cfb7be4237d3cf439075a36e&chksm=cf565696f821df80c766f6f5af0803d492c24527f0bffd6a4028cac4a150e216e79b19ffe1e7&token=1093611389&lang=zh_CN#rd) * [冰河公开了这份4万字的SpringCloud与SpringCloudAlibaba学习笔记!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490745&idx=1&sn=fb3cc2c3e94eec9e4dd4d5d3a33d8ffb&chksm=cf55abb8f82222aec0d694cb3e1a4f503d072e55a2cd6f35e2de1dee56c17f6cc92876cfe6e5&token=776659970&lang=zh_CN#rd) * [SpringCloud Alibaba开篇:SpringCloud这么火,为何还要学习SpringCloud Alibaba?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489145&idx=1&sn=aca42e8b89177c700c01064ecf0410c7&chksm=cf55a178f822286ea2725d9853f79c69c1757621cb8e974463a99c7650b6985a3014202e462d&token=776659970&lang=zh_CN#rd) * [系统从初期到支撑亿级流量,都经历了哪些架构上的演变?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489143&idx=1&sn=ee96ac3db5d2f7785dbc9ed5698e7a70&chksm=cf55a176f8222860fcb2c69e9f68e4f1ffcee6767dcb1e7dc009990468415893e11ed9e7fbde&token=776659970&lang=zh_CN#rd) * [卧槽,被SpringBoot整合原生OpenFegin坑惨了(非SpringCloud)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489141&idx=1&sn=6072866b3cb1338970601f25c1b3e46b&chksm=cf55a174f822286226e54fb62f317ab451be948b77cedeb2df5c35c8e025e8265542193ae6ad&token=776659970&lang=zh_CN#rd) * [我所经历的价值超10亿美元的直播系统架构图是什么样子的?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489129&idx=1&sn=b4493599559871587e3a03a630e957c3&chksm=cf55a168f822287e04d4127ca7a099ab21e92f144da67d9bda4fe4971689dd70f3f90d935fe5&token=776659970&lang=zh_CN#rd) * [实战SpringBoot 
快速支持动态配置国际化](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499964&idx=1&sn=5e44261a4066b81cb84ef8b58a6124a5&chksm=cf5677bdf821feabea938c78b53fa9d377cfe9520a4d49fd33f81e654e321f212fe02e14859c&token=2138570566&lang=zh_CN#rd) ### Nginx技术 * [记一次生产环境大面积404问题!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490982&idx=1&sn=a030fac171a108c148badbb627ee8f35&chksm=cf55aaa7f82223b13b5b74b784a79bc2d7ff13348647f014d717452a79a3290e87bb881bd2d7&token=776659970&lang=zh_CN#rd) * [一文带你搞懂Nginx如何配置Http、Https、WS、WSS!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491149&idx=1&sn=04baa8af488a6ce39bc369a64c5109a6&chksm=cf55a94cf822205a4144f97bda5a79fefb2433c565f108f5e46bf8854284e0481c51d668d5aa&token=776659970&lang=zh_CN#rd) * [冰河又一本超硬核Nginx PDF教程免费开源!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489188&idx=1&sn=652e3f9f6845a0c4c3388b70308c033a&chksm=cf55a1a5f82228b376bb330c18fa9a6f6dd49548de0a2693b7239c0c7aebeda334ef122662e2&token=776659970&lang=zh_CN#rd) * [如何获取客户端真实IP、域名、协议、端口?看这一篇就够了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489270&idx=1&sn=8f72307402b0e1924ac7a61c3ff1d9ed&chksm=cf55a1f7f82228e1e2c2043f875aeec3f90186500e15d7dfe99776a98f9b831b617cce788969&token=776659970&lang=zh_CN#rd) * [实现负载均衡、限流、缓存、黑白名单和灰度发布,这是最全的一篇了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489269&idx=1&sn=0943fb133c9dfe72d682316c02144901&chksm=cf55a1f4f82228e2cfa3ffe1de0c3e90e5135ed5d1946bcc81bba489979a7ea4ecae2dac4ddf&token=776659970&lang=zh_CN#rd) * [面试官竟然问我Nginx如何生成缩略图,还好我看了这篇文章!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489268&idx=1&sn=114d161cfc7ebab24a5936cfd35c85de&chksm=cf55a1f5f82228e3a3efe62dfc07d4b6917fdb45edd73a5a29984d02ca739ad6c0211669f346&token=776659970&lang=zh_CN#rd) * [如何封禁IP和IP段?看完这篇我会了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489267&idx=1&sn=c9763c7bd1d016be466b1d19bfb194d0&chksm=cf55a1f2f82228e4d965db2f1442f92611d4cdc9522763174146fc5bfc15d5a183c5517abf57&token=776659970&lang=zh_CN#rd) * [如何按日期分割Nginx日志?看这一篇就够了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489266&idx=1&sn=b8d81d7872b21c3bdc2b79954afb7a92&chksm=cf55a1f3f82228e5807042b2181f61fae9bb75fdd9170ac0badc1db59e4201bfe3cfedb77d63&token=776659970&lang=zh_CN#rd) * [如何为已安装的Nginx动态添加模块?看完我懂了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489265&idx=1&sn=31b5d53e14577097007bca8d9f99f4a7&chksm=cf55a1f0f82228e6f7a47dbea771a4ebbc7ca628320527ee9cbf041825068cb763848f3c12bc&token=776659970&lang=zh_CN#rd) * [如何格式化日志并推送到远程服务器?看完原来很简单!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489264&idx=1&sn=b4af7a03bd247e26bbb0530a152a35ed&chksm=cf55a1f1f82228e7adc38a9685b22ca02e446f718376c137499c65f9ad613698c17c81d9371a&token=776659970&lang=zh_CN#rd) * [面试官问我Nginx如何配置WebSocket?我给他现场演示了一番!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489262&idx=1&sn=e60506742aaefef71b55785ef9022030&chksm=cf55a1eff82228f99a9bb9eec79f91e5c2b1b08c7b0c650cb8de99262b1447fca6bc99fc3b7c&token=776659970&lang=zh_CN#rd) * [如何使用Nginx实现MySQL数据库的负载均衡?看完我懂了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489261&idx=1&sn=eebb6223c4f61cb2311145e081269393&chksm=cf55a1ecf82228faa0815550f3251695560ecc071569070124ea1d42fe6f46928ea8753c7bda&token=776659970&lang=zh_CN#rd) * [还不会使用Nginx解决跨域问题?肝这一篇就够了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489260&idx=1&sn=145c3ad54d74d6e4bc8eb0c1a21f5e4f&chksm=cf55a1edf82228fb7c1e14a1521873580c9754b6f19cdc7dce0ac281416df33c117da9aca9b2&token=776659970&lang=zh_CN#rd) * 
[图片显示过慢,文件下载不完全,竟然是Nginx的锅!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489259&idx=1&sn=8a59e21a2a0cca10a9eff8d90d5fe921&chksm=cf55a1eaf82228fcbbf316b55a358617929e2c1d3636be231f35ff9c7a7d2e9d336314fcd2a1&token=776659970&lang=zh_CN#rd) * [如何使用Nginx搭建流媒体服务器实现直播?看完这篇我会了!!(文末有福利)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489258&idx=1&sn=f5109bb08a3234107f633c4cb6c5ef48&chksm=cf55a1ebf82228fdc7f15c9e5ab2440a46031e2b17b911027479057dbdf0fe53232e0aeda722&token=776659970&lang=zh_CN#rd) * [并发量太高,Nginx扛不住?这次我错怪Nginx了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489257&idx=1&sn=0c7e9651f21e56851607fe59d0887b44&chksm=cf55a1e8f82228fe8dc6b8742eeae4fce960f5f433419dec26e78f2dd1a5f3857b8429447f0f&token=776659970&lang=zh_CN#rd) * [如何实现Nginx的高可用负载均衡?肝了这篇我也会了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489255&idx=1&sn=178b2cf03cbb9c5cf2c48c95bf69ce39&chksm=cf55a1e6f82228f026b2a886012ebb10e393a8c7b2eec51ca197ee9ec8d359f59260ced69969&token=776659970&lang=zh_CN#rd) * [如何使用自签CA配置HTTPS加密反向代理访问?看了这篇我会了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489221&idx=1&sn=dc4cbda1f5910cecb1f6209cc75a509d&chksm=cf55a1c4f82228d201589f84eb860eaa0265432edc7ae7eba113218eede8bb0c93758c4f9fa2&token=776659970&lang=zh_CN#rd) * [如何搭建Nginx+Keepalived双机热备环境?这是最全的一篇了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489212&idx=1&sn=e6d17bfa0adc237c39f0cf56e4afdb9d&chksm=cf55a1bdf82228ab09f138a3e668cf673aad05f46b4e3a85d2f22697e9b6b5510a78074a727a&token=776659970&lang=zh_CN#rd) * [面试官:给我讲讲Nginx如何实现四层负载均衡?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489206&idx=1&sn=a77f5c7d4eb66598b3a57c9d74f348ef&chksm=cf55a1b7f82228a198342bd023dd0e7daf8d4cb71262522df40156b348e2be2803c663be920d&token=776659970&lang=zh_CN#rd) ### Spring注解系列 * [聊聊Spring注解驱动开发那些事儿!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489310&idx=1&sn=dccb43beea0aca68c5f14fabc1a8252e&chksm=cf55a01ff8222909f6c327a3cfa882258d10bcaf3a864aa3845a4ab0334dfa304a287861fc68&token=776659970&lang=zh_CN#rd) * [使用@Configuration和@Bean向容器中注册组件?看这一篇就够了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489309&idx=1&sn=0653cf09236644996db5209a0ab10776&chksm=cf55a01cf822290a16b13adbe46981535fc0b885f4b0e157f5fe24ceaa3547a11ed971dd1efa&token=776659970&lang=zh_CN#rd) * [使用@ComponentScan自动扫描组件并指定扫描规则](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489308&idx=1&sn=bc479764e33c46724614036686a1b258&chksm=cf55a01df822290be561fb22e96ba97cdea980b6838ac0a739cfced52b09b93a153e64d5857c&token=776659970&lang=zh_CN#rd) * [自定义TypeFilter指定@ComponentScan注解的过滤规则](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489307&idx=1&sn=66467a37a48b8784e9f0ea76a63d2e00&chksm=cf55a01af822290c0d85dba69159822eea5a460174b513121b022979dc0779749803e309a07f&token=776659970&lang=zh_CN#rd) * [使用@Scope注解设置组件的作用域](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489306&idx=1&sn=45e9c6f14de9eeadf4153c8a0df4157f&chksm=cf55a01bf822290d73ed717b67fc0af081c4772598c113f465e2862bc2ec99605f267fd6ea16&token=776659970&lang=zh_CN#rd) * [不知道什么是懒加载?看这一篇就够了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489305&idx=1&sn=983c4e75e2ffd12d2653ab8013fac3bf&chksm=cf55a018f822290eecd1e85c5160430f702c85fc32de5a810ff04405a9de5557764f7c7d0bb3&token=776659970&lang=zh_CN#rd) * 
[如何按照条件向Spring容器中注册bean?这次我懂了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489304&idx=1&sn=4186e3b0208ac40ad53a25411bf985c5&chksm=cf55a019f822290ffeda65ed64267fa6dd37b1b688e153fc040322d8b8f92209703854b15f90&token=776659970&lang=zh_CN#rd) * [使用@Import注解给容器中快速导入一个组件](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489303&idx=1&sn=b5292f66835dd95ca43eebd76201f935&chksm=cf55a016f82229004d1aa496cafcfc374eef1e8b2df795a292ef457c691592f8540147c4fd55&token=776659970&lang=zh_CN#rd) * [在@Import注解中使用ImportSelector接口导入bean](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489302&idx=1&sn=50c871b0b0e4f52a9301f7ce37708df2&chksm=cf55a017f8222901e31555a8962fe59867459cf973a36c4baac01e0441db2cedb488cb5721ff&token=776659970&lang=zh_CN#rd) * [深入理解Spring的ImportSelector接口](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489301&idx=1&sn=0a4488c88458d2211786c84e53a07a21&chksm=cf55a014f822290228ec67ae39a5228e92af88f70c9f94193ac59244efa612f8397bafd5743b&token=776659970&lang=zh_CN#rd) * [面试官:如何将Service注入到Servlet中?朋友又栽了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489299&idx=1&sn=dc9913a3e6a35aa4fe74b88aa4820996&chksm=cf55a012f8222904f6735e7459bd405d5cd2ea449371441f6b4be23dac89d05cf2a8feb1ea0a&token=776659970&lang=zh_CN#rd) * [在@Import中使用ImportBeanDefinitionRegistrar向容器中注册bean](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489298&idx=1&sn=b686e50d0c27bf4e4cbd6be651613687&chksm=cf55a013f8222905f521513fce253872a9b476a008674a8b1d9b51ac09ea3a8bc5aa1d071883&token=776659970&lang=zh_CN#rd) * [面试官让我说说:如何使用FactoryBean向Spring容器中注册bean?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489297&idx=1&sn=7758415066c014d460f9b7994a26548d&chksm=cf55a010f82229064a6a85cbabb75137bbfe0ae5354a9ccbeed481a2c279e6427ca871c95415&token=776659970&lang=zh_CN#rd) * [如何使用@Bean注解指定初始化和销毁的方法?看这一篇就够了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489295&idx=1&sn=c5edfac2d7349b804ea9e03dae18d4cc&chksm=cf55a00ef8222918260acffdc5fdd3656d9fa532b503fe7c920140f1a9ece3fe32a665e634a7&token=776659970&lang=zh_CN#rd) * [Spring中的InitializingBean和DisposableBean,你真的了解吗?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489294&idx=1&sn=656cf69dfa8a7dc8817f93330e0cabc6&chksm=cf55a00ff8222919a17e5570bb59992f5df79116d313a21e65da52db6aaa12a1401c4c059c8d&token=776659970&lang=zh_CN#rd) * [你真的了解@PostConstruct注解和@PreDestroy注解吗?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489292&idx=1&sn=9df66f4ce0e983301114026855d65a6b&chksm=cf55a00df822291bd250c55eb3f2e74c6c750ee0d3af001b997d676e2308c9836fdc5b2b1a58&token=776659970&lang=zh_CN#rd) * [@PostConstruct与@PreDestroy源码的执行过程](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489291&idx=1&sn=9f5a892a7bb5c695beda309f6873a0fa&chksm=cf55a00af822291c104db2efa426b784a097237529791edbf9d79f8a77de6f34b4848789d397&token=776659970&lang=zh_CN#rd) * [关于BeanPostProcessor后置处理器,你了解多少?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489290&idx=1&sn=7a2c6d0e960122871121b3339aa97eda&chksm=cf55a00bf822291d6ce498fa3254bc283b4732d0754df2a9094d5521c3a1b513acee9ae17737&token=776659970&lang=zh_CN#rd) * [针对bean的生命周期,我们究竟能做哪些工作?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489289&idx=1&sn=575cc9f1555bc929c8add21874a24133&chksm=cf55a008f822291e99621aa34793b2b3b598d75d224f1f96015d2c5b413875d3900f9b78abb1&token=776659970&lang=zh_CN#rd) * 
[面试官再问你BeanPostProcessor的执行流程,就把这篇文章甩给他!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489288&idx=1&sn=1539ba17e1be37437a33fcce6baa2c67&chksm=cf55a009f822291f6b068a2dcb7cbb3234a82f84cd29c2b1cc4f04a0ec7416bfd08b4a9c61ac&token=776659970&lang=zh_CN#rd) * [困扰了我很久的AOP嵌套调用终于解决了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489286&idx=1&sn=2169252c0eaa130d8f79cba49e80b060&chksm=cf55a007f8222911826eba428a77eb5a3b40624a4cd7cb5fd26021c1fabe3c9245accdcea038&token=776659970&lang=zh_CN#rd) * [BeanPostProcessor在Spring底层是如何使用的?看完这篇我懂了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489282&idx=1&sn=07296f9b6ff2b36153f25e43a4769e0f&chksm=cf55a003f8222915e5eb75c15ccaab9e7f7f12c53a74c4ed828da90a86f6a1fc03b60969d005&token=776659970&lang=zh_CN#rd) * [如何使用@Value注解为bean的属性赋值,我们一起吊打面试官!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489281&idx=1&sn=d57cc587101e3fba1f176538042cc003&chksm=cf55a000f822291634780af1a4fc831bc5875d414ceee00cd0af47a4036e4478e39bbed99bf5&token=776659970&lang=zh_CN#rd) * [使用@PropertySource加载配置文件,我只看这一篇!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489236&idx=1&sn=7bc6f84c959454ac54f82f4971b4e688&chksm=cf55a1d5f82228c3d04d83ed495bd8875039b7cac1f722694dadaf3ea225de7aebb41b11cf59&token=776659970&lang=zh_CN#rd) * [使用@Autowired@Qualifier@Primary三大注解自动装配组件,你会了吗?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489235&idx=1&sn=a219d46700547fa72ccd89c57cf4129f&chksm=cf55a1d2f82228c4796ec2d19138c7e3a9a5186ccc3967801ecd2481a9f18a6cf025f657bd02&token=776659970&lang=zh_CN#rd) * [你还不会使用@Resource和@Inject注解?那你就out了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489234&idx=1&sn=0fbd5246a090de06348791d77f115810&chksm=cf55a1d3f82228c5da95c44ffc3e8b306ee1edc6f08cbe2ac223f280808745413aa935d75175&token=776659970&lang=zh_CN#rd) * [如何实现方法、构造器位置的自动装配?我这样回答让面试官很满意!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489229&idx=1&sn=8180a35d131c8134226d951bb43b958b&chksm=cf55a1ccf82228dabc01bbb036f89177a3f06ea7c2820c0be24ec6f1f675f79df9534dd8e9bf&token=776659970&lang=zh_CN#rd) * [如何解决Spring的循环依赖问题?这道面试题助我拿到了Offer!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489228&idx=1&sn=7aaeed8347d0af9fb2d302375c8863ad&chksm=cf55a1cdf82228db5f7b050fc6dada5b88dfd60db1103a8a101b5bb0435b0cd465e94287b00b&token=776659970&lang=zh_CN#rd) * [自定义组件如何注入Spring底层的组件?看了这篇我才真正理解了原理!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489226&idx=1&sn=abab63bda0f1264b030fbf7ec41d52a6&chksm=cf55a1cbf82228dd55a462100fa6bc27e6d75fd898d49a985c8c61c9503ed64c04705368bcf8&token=776659970&lang=zh_CN#rd) * [如何实现多数据源读写分离?这是我看过最详细的一篇!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489222&idx=1&sn=127337da3ff999c8e33ca3a36034a551&chksm=cf55a1c7f82228d18bd1f5fdf387e72b31f2d60cb9a84f8474b840f6132bb02293c695e8fa75&token=776659970&lang=zh_CN#rd) * [使用@Profile注解实现开发、测试和生产环境的配置和切换,看完这篇我彻底会了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489219&idx=1&sn=fa2b821de8613178afe23e1038fb76e7&chksm=cf55a1c2f82228d4024cf2e859a30b50806ecafab99e7c46579414fcd75ecfb0942486f5f0a7&token=776659970&lang=zh_CN#rd) * [你敢信?面试官竟然让我现场搭建一个AOP测试环境!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489211&idx=1&sn=bb6010e420915ed179391de3f70b86ae&chksm=cf55a1baf82228ac8e219cf98135b97d3e71259574610884de73549a195ef480aa7882750270&token=776659970&lang=zh_CN#rd) * 
[二狗子让我给他讲讲@EnableAspectJAutoProxy注解](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489210&idx=1&sn=becc26b4b2d681007bfa52ce2448eed5&chksm=cf55a1bbf82228ada0bd72aec8670bf774918b7bbaa2613baa59d77008566400a75b7d5be6a9&token=776659970&lang=zh_CN#rd) * [AOP核心类源码解析,这是最全的一篇了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489209&idx=1&sn=bfb8b57184f15aa3407ed43baf76bfd1&chksm=cf55a1b8f82228ae396cb480789fe08866366e3f29f900dd23b3311e136524c311251c72308e&token=776659970&lang=zh_CN#rd) * [IOC容器注解汇总,你想要的都在这儿了!!(文末可留言)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489201&idx=1&sn=b82ac41138725d3ba1fce1b6b324fcf1&chksm=cf55a1b0f82228a654acca2742e62ceb13571215d051a22d1b9d03b9aab56523d22525a24705&token=776659970&lang=zh_CN#rd) * [为啥你用@JsonFormat注解反序列化LocalDateTime总失败?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489170&idx=1&sn=54fd5ea7592a614e83b73527b8b0fbc2&chksm=cf55a193f8222885628dec6bd4d8d5a840039b914e12361718916b6ff033d5415298e6db31c1&token=776659970&lang=zh_CN#rd) * [如何实现多数据源读写分离?这是我看过最详细的一篇!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489112&idx=1&sn=b70f339a3a7c09e91d4d6c7b08db1f00&chksm=cf55a159f822284fd0bdb37a1e7603157486a02787c7f4d4b044a14197bb9806c4babaadd94b&token=776659970&lang=zh_CN#rd) * [一张图带你彻底理解Spring如何解决循环依赖!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489109&idx=1&sn=e15dd6789f190868c8ee19046d1f676d&chksm=cf55a154f822284203bd64fd9ad23c38aa7d502780c3577cbfb12ac2c2e9160a21a251c65274&token=776659970&lang=zh_CN#rd) * [Spring AOP核心类解析,这是最全的一篇了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489102&idx=1&sn=b09b429c8e40863208dafa47f554e15f&chksm=cf55a14ff8222859fe8b31d32dcb7a722ef1c539befc5e093b64fdbb3da99fb22dd3bfc6ec35&token=776659970&lang=zh_CN#rd) * [小伙伴们在催更Spring系列,于是我汇总了Spring中的各项注解!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491199&idx=1&sn=524a1942c88d19e0536a2b1a06c4d3de&chksm=cf55a97ef8222068e51544a288302363290dd95c4bab7bd78205ffd5784c838cbfa745183e6c&token=776659970&lang=zh_CN#rd) * [一张图带你窥探「Spring注解系列」到底要更新些啥?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491259&idx=1&sn=8bda41604df41c407d6d3b6820c5495f&chksm=cf55a9baf82220acb058117cf4c3b204d2f911839abf05258f6e3c55b25038c98d5337e2e7a5&token=776659970&lang=zh_CN#rd) * [AnnotationAwareAspectJAutoProxyCreator类是干嘛的?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491307&idx=1&sn=6eec4078f4eea8c265682784ab9a6b13&chksm=cf55a9eaf82220fcc54c4a15876335b2bc95b7ed6e3cc0524247379e401716cfdb166f404366&token=776659970&lang=zh_CN#rd) ### 设计模式系列 * [深入浅出Java 23种设计模式,最全PDF版本终于开放下载了!!面试必备!!(文末有福利)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489263&idx=1&sn=05a86037bf7e07fe6b8e4c8b89dcb5a0&chksm=cf55a1eef82228f8a79b0440852a00e39c5d3a880ace7c08ae2b29aa631ab9db87151ecc5ecd&token=776659970&lang=zh_CN#rd) * [设计模式汇总——你需要掌握的23种设计模式都在这儿了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489408&idx=1&sn=84c48c9c3707691bd7709ebec66e82f1&chksm=cf55a081f8222997a7d2d80eba4f71cdab620c324b86a726f20c737738e608969e00f278a6b5&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——抽象工厂模式(AbstractFactoty)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489430&idx=1&sn=308eba7c5b9c53f61c45e58d5563d320&chksm=cf55a097f822298144844a7d59f0311f56a873f64aeb0ae80d096160fe42a345a6ab6aefd0be&token=776659970&lang=zh_CN#rd) * 
[浅谈JAVA设计模式之——工厂方法(FactoryMethod)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489429&idx=1&sn=31d7f6094a5c46d18f272c0bdf7ac086&chksm=cf55a094f822298295d82c0f26b82426fa6bbcb527666ab4dbd918e1fad88c385fab3111cb1b&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——桥接模式(Bridge)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489428&idx=1&sn=f1efabaaafedf1b18a0258f972a068c3&chksm=cf55a095f8222983d0a1a56c7c2493f6bffe4a815690ff75a4ef3ebe2559e8734e0ffa95be29&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——建造者模式(Builder)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489427&idx=1&sn=cac4feff8a4c93ea9d593836d1c5f6a5&chksm=cf55a092f8222984c40b7d0e058fe447e8a554584ebb324f3b90ffb6d6784979edfe1f6bb12d&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——组合模式(Composite)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489426&idx=1&sn=5bda11d90dc3feaac9c30f78c3090eef&chksm=cf55a093f822298581fe6ec8dcbe4ec0e36db42d5d53fee89d30fb6bbe4cf2c14a1f9c70ee74&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——原型模式(Prototype)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489425&idx=1&sn=e48b3e35131435375217f51ac4a7f7aa&chksm=cf55a090f8222986b2c94a2b87b15ef15c555ab8e936522a40cd880f30ba39b2dae461a0c960&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——适配器模式(Adapter)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489424&idx=1&sn=d61288d307b30a8417cf0afa0c487816&chksm=cf55a091f82229877f9b78d45676553042668f5da842f517eceeb9bd5a01c213d65caeb779a9&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——装饰模式(Decorator)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489423&idx=1&sn=d1f747095fb9a6674134e06930a1767c&chksm=cf55a08ef8222998d98c9cb55783e59c45d7085260b2389906af286f3bd9ad49c82bffc47e50&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——外观模式(Facade)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489421&idx=1&sn=2a1612d1ec75f82a035c64dca2dd95f7&chksm=cf55a08cf822299a3c5169c8b14e3d3368080957f747b400ee3b02dae0bfb036b7d8f7f12b8d&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——享元模式(Flyweight)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489420&idx=1&sn=c9b4c358b69b2bccc24b26b8615d1739&chksm=cf55a08df822299b52656b120e57d6c933098a6d9d9f29e65a5bee64fddaf7f26a78d39a1ed9&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——代理模式(proxy)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489419&idx=1&sn=572121286c4bd29b172fd858ed1f3f4f&chksm=cf55a08af822299cb9ee19f9e936c7b072ada879d4a8b820d7e367b97e8b93b67b4adf161429&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——责任链模式(COR)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489418&idx=1&sn=2ba85330aaca7b0f32ef1d96d5c7e7ef&chksm=cf55a08bf822299d29f0812b7a2fe8ac81be2bc5024b12f538265befd955f0a54fa6114b8cca&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——命令模式(Command)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489417&idx=1&sn=83b646905e3774ef2af7fc9d7e819a03&chksm=cf55a088f822299e16b2b15054d45359882a13cad13943358c32dcdc6691f8a15b3fa3f67c49&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——解析器模式(Interpreter)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489416&idx=1&sn=1e58a234a9e75d88be16cec587d32018&chksm=cf55a089f822299fe20aa89c4b02ed434ebde4205b0935fb7dab9d3e0c378ebc2acee83d03dc&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——迭代器模式(interator)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489415&idx=1&sn=9640f75d501ac962443a0a99c58addae&chksm=cf55a086f82229909a702e48c291c68c73f93dd416c02530be835fde7c77cbafb7773b6897d6&token=776659970&lang=zh_CN#rd) * 
[浅谈JAVA设计模式之——中介者模式(Mediator)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489414&idx=1&sn=55cc6052e3a4485d140504a1f8a1b176&chksm=cf55a087f8222991ee211cfeb4287eef3159e10f8d36b82725f1271b36b48e6f2155b9d1d28f&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——备忘录模式(Memento)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489413&idx=1&sn=1fb01653794d74b511845dc51cd64fda&chksm=cf55a084f82229927641a6db61ae4ec6b53577ce58634a6284bc51107d6fb30956dda5fa6fcb&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——观察者模式(Observer)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489412&idx=1&sn=53d91435feb3fd029b9bc2628ffb4240&chksm=cf55a085f82229937006fa35ee90f48910cede4c7a79a64dc890da7513a5d48ece68950fa148&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——状态模式(State)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489411&idx=1&sn=2674d51ae141e39049303026ad2a5942&chksm=cf55a082f8222994a9ce4f99ddda574a42ca1625fa22be83a2b8eca853beb041c65810ddf9a5&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——策略模式(Strategy)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489410&idx=1&sn=f24282a37d2d2bde8d43ca9d19757826&chksm=cf55a083f822299582a45b51e6a48c54eaad0f2436e3a1972444c480836b1faad0f077f6bb48&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——模板方法(TemplateMethod)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489409&idx=3&sn=3c1c0c2595733a760c8861cc19f5dc9c&chksm=cf55a080f8222996a5d907a92b9e29081244706636397962c34fef6d354f615c79922ad03b66&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——访问者模式(Visitor)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489409&idx=2&sn=e6b07bc483f6a6b73e7a5b96ffd4d67a&chksm=cf55a080f8222996e6fcdd8d92f2056a2d6fe92aa7c47ecde5695d66c24421792bcb2e43b3b9&token=776659970&lang=zh_CN#rd) * [浅谈JAVA设计模式之——单例模式(Singleton)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489409&idx=1&sn=60303bea6ab171a47d01de3333fec3b2&chksm=cf55a080f82229963e3e9460960dca4d3f4e8ce23960cd7e280febbe50d5db28a1c97cf20280&token=776659970&lang=zh_CN#rd) ### Kafka系列 * [搞懂Kafka的这个问题,你离大厂就不远了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494670&idx=1&sn=8012de632d04b38bb70064baeb6c07ec&chksm=cf565b0ff821d219055d25f41c85d11ad4edcbb25250840f95ae68ad2c4d4d8b6c7cafcf8d4d&token=1772864567&lang=zh_CN#rd) * [Kafka又出问题了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491002&idx=1&sn=1079a5e6ad9bc4cc8415d1b0c6b05083&chksm=cf55aabbf82223ad91dd208261d09301b1f5ddd165049299577a55c13161c6c431076256bf7b&token=776659970&lang=zh_CN#rd) ### 算法专题 * [我整理了50道经典Java算法题,直接进了字节跳动!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489162&idx=1&sn=bd2fdce236e4737525445018e3151921&chksm=cf55a18bf822289d154547cb62aa95f5b99f398c32b4013f793cc60089f3444ee1541eba0732&token=776659970&lang=zh_CN#rd) ### Dubbo系列 * [冰河开始对Dubbo下手了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489794&idx=1&sn=aa9efb732ac7fe3d2bc927c1491cae57&chksm=cf55ae03f82227157bb69cb487df2081b5b623d173b1df603be66347692acec56d455da6aae1&token=776659970&lang=zh_CN#rd) * [俯瞰Dubbo全局,阅读源码前必须掌握这些!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489831&idx=1&sn=e1126303a91f8fccb6154196695c5e6e&chksm=cf55ae26f82227302f71ced14c0aa312f631c8f03df65650dd3a49dc7fa9f339685342c38f5a&token=776659970&lang=zh_CN#rd) * [我是如何在短期内快速掌握Dubbo的原理和源码的(纯干货)?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490192&idx=1&sn=afedf86349f76c864f42b2b48a245c62&chksm=cf55ad91f8222487088c333cb7160dd63496f9754c5533dbc868d9ac875101cc67dc1483c0da&token=776659970&lang=zh_CN#rd) * 
[Dubbo中的统一契约是如何实现的?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490352&idx=1&sn=fce0061cd1f33d0baa1c1cad81ea1b69&chksm=cf55ac31f82225272167f0e7a3ce8914e4efb3d7a3e650ed16af386146183b13b1cac08f5107&token=776659970&lang=zh_CN#rd) * [结合实战和源码来聊聊Java中的SPI机制?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489116&idx=1&sn=dd764beecc93368365b7a0393a78bc6f&chksm=cf55a15df822284b8543bc66e490af18240bc989c26281ff3b3c8234c8cf7a24fb95d0beeb4d&token=776659970&lang=zh_CN#rd) * [你了解过Java的SPI机制吗?再不了解你就Out了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489460&idx=1&sn=ef7424f16b27379441b6c32db49bd478&chksm=cf55a0b5f82229a3bdeb54823b7a46ce449448f6fb4d19e565e689b946d389aeff190b309c53&token=776659970&lang=zh_CN#rd) * [改造低版本Dubbo,使其兼容Spring4或Spring5注解配置](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489451&idx=1&sn=8443aad6127a1a08d5c9ada01ad354fb&chksm=cf55a0aaf82229bc969e66627b77fd5a17459967ba4345d72110de9da1bc66eef9b94acec41a&token=776659970&lang=zh_CN#rd) * [低版本Dubbo与高版本Spring集成](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489450&idx=1&sn=e0507e7990a026059baea4a41020be8f&chksm=cf55a0abf82229bd4a29ef8c5a920286fd1cfd1c5d0e4cab702a98eb6634aacaefc2b6cf89c5&token=776659970&lang=zh_CN#rd) * [你知道如何将Dubbo服务打包成Jar文件吗?这篇文章帮你搞定!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489449&idx=1&sn=e28368150114a2a2059cde72d6d74cbb&chksm=cf55a0a8f82229be914bb0c88b422b5fb47bb3c29bff53dd67f8a1ff71d8672b72e4d2414e75&token=776659970&lang=zh_CN#rd) * [使用Shell脚本维护Dubbo服务](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489448&idx=1&sn=583eae9b2c29492d690143aeb9b96575&chksm=cf55a0a9f82229bfea3cedb75e7eee3292d89a61e1367ffaf0698897e131d2ecdada71bfe5bf&token=776659970&lang=zh_CN#rd) * [监控中心的介绍与简易监控中心的安装](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489447&idx=1&sn=03985c758e3ab14c698a73a2d054b9ee&chksm=cf55a0a6f82229b0c1002612b1f07604b83d5e5ecc4db5bd172906ee476193fa63f5bc9be1f8&token=776659970&lang=zh_CN#rd) * [Dubbo 管理控制台的安装](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489446&idx=1&sn=1280e760a0d3a90bcf4fb1cb5e11b149&chksm=cf55a0a7f82229b13cf8f3de46e9134673c55b9d3d9198128b36a6bb6598fab96cdf6ac7109e&token=776659970&lang=zh_CN#rd) * [如何实现Dubbo服务的集群](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489445&idx=1&sn=602d31050340f353e8d3443c86d1f5cd&chksm=cf55a0a4f82229b21e232e3f7cbe0178917d98c32dcb7a30ef16f85656badb4a0d96f0cd8dcb&token=776659970&lang=zh_CN#rd) * [分布式服务子系统的划分](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489444&idx=1&sn=cb419ca548134d0adcf64b839a6e27b4&chksm=cf55a0a5f82229b34efd3fe83308015791e77ede9c735a9880638996c59160fe6b10f1804a3d&token=776659970&lang=zh_CN#rd) * [服务接口的设计原则](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489443&idx=1&sn=eb8c0c2e1f498f0f98f25ad0a440d4cb&chksm=cf55a0a2f82229b49809a8a3fd776e76fe8165b4e8ecd7df48a9d8c6fd797f9a3c111b926220&token=776659970&lang=zh_CN#rd) * [几种负载均衡算法](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489442&idx=1&sn=e2d06304a682329ead315ee127621d34&chksm=cf55a0a3f82229b5658bff25091353eecd3b68374ba5cf16e2b9e80059ae24b88ee4a3fc743a&token=776659970&lang=zh_CN#rd) * [线程模型(Linux 用户线程数限制导致的 java.lang.OutOfMemoryError)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489440&idx=1&sn=0057edbfaeebbfed4c30645181792742&chksm=cf55a0a1f82229b7bda7097262bd50cecaa89bdbf9de097272d857461fbd8ea66bba9beea5af&token=776659970&lang=zh_CN#rd) * 
[你知道如何使用Dubbo消费端直连服务提供者进行开发和调试吗??](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489439&idx=1&sn=6f3607682b5859c3cc9ec41fcee1b19e&chksm=cf55a09ef822298898f6d5ccfeadf4f5bcb087f210956c5b6da90eac6c06d5952f3d8c820710&token=776659970&lang=zh_CN#rd) * [服务只订阅(开发调试)](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489438&idx=1&sn=98ea263476635db10ca8ac8785440de4&chksm=cf55a09ff82229895f5e5fc42f6ac734e03c3742f93b7d075b2144981db1c0cc16c347616e93&token=776659970&lang=zh_CN#rd) * [Dubbo Filter实战](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489437&idx=1&sn=078b6726b051f1a07b54223a056a3d2f&chksm=cf55a09cf822298a25e08248e7c4052da1ec89045cd459f909d5742050c4520823b5b669a75b&token=776659970&lang=zh_CN#rd) ### MyBatis源码系列 * [三天撸完了MyBatis,各位随便问!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494053&idx=1&sn=643c4fcd2502d54a4c90d25dfb600f43&chksm=cf565ea4f821d7b27672a7ebc1c3350ad9bffa3b10677856399236094efa64835f5ab754c2b6&token=1929495520&lang=zh_CN#rd) * [三万字带你彻底吃透MyBatis源码!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489777&idx=1&sn=470f2d8277fe7d23c12a99de7953bef1&chksm=cf55aff0f82226e61991ebcab6c31adf28d3f86606937fcb6528dfae2e302eae0bc9fca7c18c&token=776659970&lang=zh_CN#rd) * [构建源码分析测试用例](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489436&idx=1&sn=5eae962c50043205c4b29b82cb6fcb7b&chksm=cf55a09df822298b40fc9d6de1f29bb881d532724918327ad681531b072c7960922fd360a9be&token=776659970&lang=zh_CN#rd) * [配置解析创建SqlSessionFactory的过程](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489435&idx=1&sn=def1275d8688259d50771b284cd0a1ea&chksm=cf55a09af822298c793df1877a4ecfb1f5778359c12c8659a360476828ee2ac6efc2c0103ea9&token=776659970&lang=zh_CN#rd) * [SqlSession会话创建过程](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489434&idx=1&sn=970fdb1b8655f844639445b621742b90&chksm=cf55a09bf822298d43784113fda8f19c4649464fea61f9502282dbed0145b707c3d57915a5a2&token=776659970&lang=zh_CN#rd) * [获得Mapper对象](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489433&idx=1&sn=df29c3c6d8a4dcbf665478dcc40b2259&chksm=cf55a098f822298edadd4d28f2a4b16bc95bea6be05975eb44354be4b880b5d88b62a781de51&token=776659970&lang=zh_CN#rd) * [执行SQL语句的过程](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489432&idx=1&sn=e4acf817b65685a39d8c3320d3f8e07c&chksm=cf55a099f822298f248f405b593ca01a20ddaf02ca3108b1972158a4214e4f3b5eda809b8561&token=776659970&lang=zh_CN#rd) * [面试官问我如何获得Mapper对象?我是这样说的!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489287&idx=1&sn=6d6d2864ad4ea6213cd2657a50619c9b&chksm=cf55a006f822291039a7edac9004dc3c567e2463474df9b63a3cea89db6df1e489b8db577274&token=776659970&lang=zh_CN#rd) ### 网路编程 * [域名是如何绑定动态IP的?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498618&idx=1&sn=9d18f37b6b2c04271be655efbc6785e4&chksm=cf564c7bf821c56d47892227b5f42b26ce5f4a341f516cf365a93dba6761ebf5168632c3d2e1&token=2138570566&lang=zh_CN#rd) * [一分钟解决Github连接慢或者无法连接,亲测有效!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497846&idx=1&sn=1b37541985e634583265f3e2d2ff672c&chksm=cf564f77f821c66189d7334b71bd7543aab96387bcf806dcd0af4452cf1ffaa8015235eaeb6c&token=2138570566&lang=zh_CN#rd) * [卧槽,VPN又断开了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493964&idx=1&sn=25dd1e214ed00e2503bfe32aa91dd4d1&chksm=cf565e4df821d75b5976b0f0b4df1e6e8511851a6a4ccebcbc1251310171fef282c8bcdf7291&token=1929495520&lang=zh_CN#rd) * 
[当年我凭啥能拿下月薪40K的信息安全职位?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493053&idx=1&sn=8c372106b7d569563817624a2b01ad62&chksm=cf5652bcf821dbaa6249c17ce127fcfcfd77942aa1b27d266860e009055c98d7fc06786b2be5&token=1352513703&lang=zh_CN#rd) * [万字长文带你图解计算机网络(超全)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490922&idx=1&sn=52b2aafed859e5ac14db6115e3e1a7fe&chksm=cf55aa6bf822237d95b0e9409c5f3b517fb06db8206dd001524c38e506048b07c2fefb0d195f&token=776659970&lang=zh_CN#rd) * [域名还能绑定动态IP?看完又涨知识了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489159&idx=1&sn=c3bf154d6f76f867610ef0c802bf48cb&chksm=cf55a186f822289043d5aa9c0a999c6557c976de7a12b7173a457b957dbd7bc0597d16ae3307&token=776659970&lang=zh_CN#rd) * [面试官:讲讲七层网络模型与TCP三次握手与四次断开?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489146&idx=1&sn=44d502461973c72a418d5597fc9c434b&chksm=cf55a17bf822286d5daf2f7fea9832254d34830df067ef2fff588eadf68151b3e79097a8613f&token=776659970&lang=zh_CN#rd) ### 深度思考 * [程序员如何让自己更快的废掉?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489844&idx=1&sn=30bbf9c5e4ab1ea59029a01738fc386d&chksm=cf55ae35f82227231f51b4dcc68b90b2270b4310f4d7960d1cce3c64fe755e4de7c36b824136&token=776659970&lang=zh_CN#rd) * [真正牛逼的人,都是极简主义者!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490442&idx=1&sn=f0fc5a95f429e75f1c5726867152ebd6&chksm=cf55ac8bf822259d639a1ee405167a8935cfa0c35c46f7f3c8c2622b6659c224f9d930cc7370&token=776659970&lang=zh_CN#rd) * [在一个执行力极差的团队工作是一种怎样的体验?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490758&idx=1&sn=ebde7336db83320f0886f716b327687d&chksm=cf55abc7f82222d1fb55be7b71be851a2e67d1410dcf3c0fa094da95c1ed7761d102c7ae0f0a&token=776659970&lang=zh_CN#rd) * [年薪千万的产品经理打开了我对这个职位的新认知!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490782&idx=1&sn=8008870f8f460672ad6ad8e763f46815&chksm=cf55abdff82222c94a86cff432a19102d03640617f8806afd8cf64b58f684a4c39ed3c0a7ac4&token=776659970&lang=zh_CN#rd) * [深度思考:无关生智,局外生慧!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490856&idx=1&sn=ab1808517ce00ff189e1c7860a0ccaf9&chksm=cf55aa29f822233f3f226b9e7266456f7906f36569e6fbd4c9598f82179f6d6529ff30bd4a9b&token=776659970&lang=zh_CN#rd) * [如何成为一名拖垮整个团队的产品经理?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491156&idx=1&sn=ac80e9cdf4b5822c03cbad700e6d8d55&chksm=cf55a955f8222043cf4755d07a27f52eb62cb5cfd021980dca31189f77eba2ac401cb3f96c37&token=776659970&lang=zh_CN#rd) ### 程序人生 * [冰河被邀请成为TVP腾讯云最具价值专家啦](https://mp.weixin.qq.com/s/k-t1CfLs70-wFu-3mZTuHA) * [2023,新的一年,新的规划!(文末有福利)](https://mp.weixin.qq.com/s/wGwpkZ4Rk4spfP5K8tseTQ) * [请给原作者和原创一点尊重!](https://mp.weixin.qq.com/s/1zSApb4QKN9dDbAj9661ZA) * [2022年,大部分人半途而废的原因](https://mp.weixin.qq.com/s/TQAKNnMI40oDEOV-KU6pVg) * [台湾同胞们也看上咱出版的这本书啦!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247503017&idx=1&sn=6670f15d1939257c189846d59dc7f358&chksm=cf567ba8f821f2beeabcce3ca935c04e1b4069e55ea2e19308369e6231c1b5b52c9de079c12a&token=1644289145&lang=zh_CN#rd) * [离职了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498758&idx=1&sn=f5d85fe6a8ae469f710908392f6cca1f&chksm=cf564b07f821c21177a7bf172f7b2f53515b9c63b41e686f013a24e86b24bda730adcf4b28cf&token=2138570566&lang=zh_CN#rd) * [新年 Flag!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498921&idx=1&sn=cb57113d8590fcc44b8b6d6fe67caeb7&chksm=cf564ba8f821c2be051606fa849d05fb95c66679e97aacc6a912ab3860512e1a1316385988f1&token=2138570566&lang=zh_CN#rd) * 
[年已过半,年终立的Flag实现了几个?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247502561&idx=1&sn=3200b8f442469c85ffa5f17e6e0da882&chksm=cf567de0f821f4f66f477b3e7ccd6dc6e5ba457b7bb66ec05c5f1aa64d7e6d0bf1f10c8ec3ef&token=2138570566&lang=zh_CN#rd) * [对着灯泡发呆:2022年我要干点啥?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499376&idx=1&sn=999829aa1168842597f0b16e51abc300&chksm=cf564971f821c067d58de32eced4da7c350f5a3ad80c47fa4518c8aa6dcbc30ca88267f27fbc&token=2138570566&lang=zh_CN#rd) * [冰河十年前的预测如今被阿里实现了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499326&idx=1&sn=7984a35462a1d664ea3134954589b07f&chksm=cf56493ff821c02983d959ad131a7f182cd5030d24a0437567f26561cb3326f42880fc31ae3a&token=2138570566&lang=zh_CN#rd) * [冰河写的这本书在阿里巴巴火了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498952&idx=1&sn=4c585cd34e7145fa29d62b5df1e12c5f&chksm=cf564bc9f821c2dfd05d3db88cfae52cb60e8eb1ee4dd85e5b91cafc7ef5f37cd2c9ebaabf90&token=2138570566&lang=zh_CN#rd) * [这个月,4本书上架阿里藏经阁!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498919&idx=1&sn=067fb09ae6143ebcb78454b25a12e3ec&chksm=cf564ba6f821c2b074b57c79cea6c0f35424c6d64a97f89d4614710dd72e3eb8a756b4bd3748&token=2138570566&lang=zh_CN#rd) * [冰河最新电子书上架第一天阅读超15万!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498685&idx=1&sn=b96cfb7e32ed9fd3a5c0614c10175901&chksm=cf564cbcf821c5aa9bee21d05b5399ef22e21adf7e2650073c3b288d7bdfe08e8b00b544c698&token=2138570566&lang=zh_CN#rd) * [B站来袭!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498662&idx=1&sn=f07ef67769b1beeee24bf41ace71100f&chksm=cf564ca7f821c5b1fdd85fd3bffc61b83e69cc018f34bf2107d8667f2fd7fb1103546b981e61&token=2138570566&lang=zh_CN#rd) * [Spring Cloud 突发重大漏洞!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247499908&idx=1&sn=54c792a0519eff9d7ae1e600805aeef7&chksm=cf567785f821fe93d9abc40abe3a01306ba97b53069e59b4a685a3a76698d9d2c437b46caee1&token=2138570566&lang=zh_CN#rd) * [开源的文章又被培训机构“BP”了,这次不忍了](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247500122&idx=1&sn=480c946ef5b94b86d146ab8aaffc439b&chksm=cf56765bf821ff4d4f266ea73103d17f4509c3a13102beb90b59affc9470eaae90a68362a14f&token=2138570566&lang=zh_CN#rd) * [冰河写的文章开源了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247498274&idx=1&sn=1d9f925a1df8f5a0abd5ca458934071d&chksm=cf564d23f821c435f9fe675aecce576da6aba30d8f77af636a0dd41f976150ac24107f79ddb9&token=2138570566&lang=zh_CN#rd) * [放假了吗?不,你没有假期的!!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247501430&idx=1&sn=30c52253266ca68ef313a28a0626f115&chksm=cf567177f821f8618305167308df80b296818f572095b6c395a023e8f446716c972900fd52f7&token=2138570566&lang=zh_CN#rd) * [放假第二天的重大事故:忘记服务器root密码了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247501464&idx=1&sn=6c369f3da796499aa54508437fadbe11&chksm=cf567199f821f88fbfb7fe7074cfe6b5e223f1779b57fccea4fd0c51bfb375c5b48f855346f0&token=2138570566&lang=zh_CN#rd) * [冰河在大学是怎样度过的?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497986&idx=1&sn=041e4ff2c1dfe235ed1da60cee930459&chksm=cf564e03f821c71521ed834533fc4903cbd4d10d02222c9853047ca0388b520c649798de6271&token=2138570566&lang=zh_CN#rd) * [冰河作客开源中国“高手问答”栏目了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497953&idx=1&sn=03b0377302d0ea619519b800390cbe94&chksm=cf564fe0f821c6f6048321e35fa84b26ceb028b83e62cb80e457427e2d2105761afa0e70acd7&token=2138570566&lang=zh_CN#rd) * 
[一个脚本教你快速去除桌面图标烦人的小箭头!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497936&idx=1&sn=783d8a11c155c0a0dcfd8654295138cb&chksm=cf564fd1f821c6c72cdef5edce37a29321730d69e686b30adab49278138832acbba06605494c&token=2138570566&lang=zh_CN#rd) * [这些网站陪伴了我5年!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497781&idx=1&sn=8880e760dd07a6ac6f2992bf6ea9cfd1&chksm=cf564f34f821c62289fe696f7960903a2cc4ba445a658be00111fbaf2f5290e96ffb936c6a31&token=1170633251&lang=zh_CN#rd) * [气愤:特么原创文章被抄袭者举报抄袭了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497737&idx=1&sn=8b02a91d1b6243cc788993d985a815fe&chksm=cf564f08f821c61e320c5d296e0cd92acfeacd4c0de1d7c2584e0f510f2839d66692d31def60&token=1170633251&lang=zh_CN#rd) * [冰河去开源中国了?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497684&idx=1&sn=b6eb0c56807346e726ecb0cfc3464200&chksm=cf5640d5f821c9c326a5c35c2f3ea581b302555a69459fc781a9ce88cb45c14e0bc8d78a9ae9&token=1170633251&lang=zh_CN#rd) * [医院,你好狠!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497618&idx=1&sn=c4f00728f452dbdd57840c7c541d8d28&chksm=cf564093f821c985217a626f404656261e98f5a110b32503991eb129279c4e18501920d91426&token=1170633251&lang=zh_CN#rd) * [冰河又被全球敏捷运维峰会推荐了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247497081&idx=1&sn=0d94381c62d5daca51e1e4549d883c96&chksm=cf564278f821cb6e7ca96749658d1c235b9a1319b35f3a991efa47e76fbecd3f1dc26b6f212c&token=1170633251&lang=zh_CN#rd) * [这件事我坚持了9个月!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496676&idx=1&sn=e3ee359da4ec9d8ff5cfb739c6ebfed5&chksm=cf5644e5f821cdf38fa76d6cc09ce922f11917021ba556caa2f5aee935f33d44fc308bf71f0a&token=627091476&lang=zh_CN#rd) * [跟小伙伴们聊聊我有趣的大学生活和我那两个好基友!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247496575&idx=1&sn=219b048616242887b1b2f828e9f8e28b&chksm=cf56447ef821cd68265098f37714357bf69c80ca640efd7a5da7e31ceca7a7237fbc3c83d090&token=627091476&lang=zh_CN#rd) * [冰河又被腾讯推荐了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495809&idx=1&sn=a347b7aefd968eb0e53e8401ef8abdd2&chksm=cf564780f821ce9690cef93dfbca06d3bea192578188aec4dc1d2cf2996e5b140ff5cb3a093f&token=546222828&lang=zh_CN#rd) * [冰河,能跟大家分享下你的经历吗?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495593&idx=1&sn=d87489315dcb0c35671798926781f20d&chksm=cf5658a8f821d1bea6490629ca9903b0cf1ea7e7cddeb5ab5af5910a21e5f4c94deab2e0319a&token=546222828&lang=zh_CN#rd) * [冰河技术微信群开放啦!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495127&idx=2&sn=50e126c1a5faab1ccc7f613a7876fb75&chksm=cf565ad6f821d3c0c6fee2478f804f1ceba0a838ea35ac9c51ec97649f78c0f13a73617f608e&token=1081687632&lang=zh_CN#rd) * [啥情况?很多读者竟然要到我QQ空间学习?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247495045&idx=1&sn=b15660f9484fadc798dbc14324fcd1a7&chksm=cf565a84f821d39217609023aad0e16c2f31d362f4d1e974ae3064c7523258640c466e02c7e4&token=1081687632&lang=zh_CN#rd) * [交稿了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494720&idx=1&sn=cd78b6068f377aaf1dba7ca0bc5a946b&chksm=cf565b41f821d257b5a8dd86d582b1e102751bedf60c9ed4b511c4e77eb3ebae6d293caa6246&token=1081687632&lang=zh_CN#rd) * [冰河裸考竟然过了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494619&idx=1&sn=0c8a69516ecb86480d7c8f5b1e45bfa7&chksm=cf565cdaf821d5ccd5f048524f8d2d18e48edba2de61ee56aa3ac9a0afa6247a68599992aac6&token=1772864567&lang=zh_CN#rd) * 
[我在清华大学看到了自己出版的图书!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494563&idx=1&sn=e4b23e47fbe128d77a0efe63f0dfe052&chksm=cf565ca2f821d5b4e49a940e839349efe76ca6c3ab50a8f9827d3c487923a1f2769cfc82ff50&token=1772864567&lang=zh_CN#rd) * [冰河上百度百科了!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494544&idx=1&sn=872111e8ad2fb4240f7502e8211a72f4&chksm=cf565c91f821d587c3d683ce7c148987acc73aebe716e34195fab18fe6b1e9c15a2929066e13&token=1772864567&lang=zh_CN#rd) * [在图书馆看到自己出版的图书是一种怎样的体验?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494279&idx=1&sn=27fb28e4f62427a4d5f35b064419d9aa&chksm=cf565d86f821d4907c2deb2adaf7927736e0d515b74897253d58b8849870c2e7de34381fa09c&token=1772864567&lang=zh_CN#rd) * [生病了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493983&idx=1&sn=401faebd5873ef02f8a5958121b04024&chksm=cf565e5ef821d74836c7458c62b67dbf714961f9fe07834825f0e037bf12873612b9f979a2c6&token=1929495520&lang=zh_CN#rd) * [想做一个平台,整合所有医院的资源!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493673&idx=1&sn=799ce630997d008c28517f302aa02417&chksm=cf565f28f821d63efeaa1431aacb9dbd0944dff2ebf4f7e02b5897c94f7f590a9be9ae4086be&token=1929495520&lang=zh_CN#rd) * [生病了,希望能早日康复!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493670&idx=1&sn=d07941b87378938d00c9d686ade7944a&chksm=cf565f27f821d63176bfc4103c8c7657c9400773325a9e54392e9959de51c319020599b7578e&token=1929495520&lang=zh_CN#rd) * [这本MySQL图书再次加印1万册!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493585&idx=1&sn=93ef5010dcf9ad9ee77b12be947e25a0&chksm=cf5650d0f821d9c65f5b0bd8c33096f0bd4ebb10667d9da31cbbd04d6f9bf4eb33592ec70e49&token=1929495520&lang=zh_CN#rd) * [冰河创作《深入理解高并发编程》的幕后故事!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493250&idx=1&sn=f07e7e632730995f864c38f79174a235&chksm=cf565183f821d895ad6f09d8a1bbc192f8a19f6f1d4768f2b3b272252c22ca1c85bc7d2a7684&token=447597716&lang=zh_CN#rd) * [冰河常逛这些学习网站!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493087&idx=1&sn=1c9a619471364701e059003ddb28d8ca&chksm=cf5652def821dbc8a193344c20a0d133926c7e765a565d9d4aef2ed5b37d184ee3d816877eef&token=1352513703&lang=zh_CN#rd) * [冰河这一个月都干啥啦?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493060&idx=1&sn=26c1ab30fe5dd021fc5f6e831694bceb&chksm=cf5652c5f821dbd3e8a415fa4e70f0b83dbff00bea9af5639c1244b78355956090522f6f119c&token=1352513703&lang=zh_CN#rd) * [五一假期偷偷学这些内容,然后惊艳所有人!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492954&idx=1&sn=b824f9fa6eb1e1c1683596458e4b9aba&chksm=cf56525bf821db4d2cf6ab008e20f8147c760bd70e41aca990c3b32a402ff3aedaf2c933fc82&token=1352513703&lang=zh_CN#rd) * [听说Seata发布新版本了?不慌,五一假期吃透它!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492935&idx=1&sn=ab7dbf10265ccd35cc0483c3adb70b47&chksm=cf565246f821db50a805dd563849cd767656ec7e26405b9e857632ff2df5171aa39654a09a85&token=1352513703&lang=zh_CN#rd) * [很多小伙伴问我如何按分类查看「冰河技术」公号文章,往这儿瞧!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492619&idx=1&sn=cee7a99c29a19ddcc4dc7b00bf813fcf&chksm=cf56530af821da1c7e760521c24e238db37ad07b798eed804b91e134f6006b5a3d72b6d5e171&token=78679959&lang=zh_CN#rd) * [毕业五年,从月薪3000到年薪百万,我掌握了哪些核心技能?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492405&idx=1&sn=008cf51e22709b37f40c1f7e6b20be17&chksm=cf565434f821dd2281b97986f4fc59024524ed2338c64f9ba4352a363755e09ffe3929cc46d9&token=78679959&lang=zh_CN#rd) * 
[文章又被白嫖了,你说气不气?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492392&idx=1&sn=6a5449a08106ab03c7423fcff41cea1a&chksm=cf565429f821dd3f44d818b7452e3ac3bb5a756b673e168bb3a30839fd6fee07496f271357db&token=1053291757&lang=zh_CN#rd) * [清明节偷偷训练“熊猫烧香”,结果我的电脑为熊猫“献身了”!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492376&idx=1&sn=9df725eb8b40dbb85359dbc25f013e35&chksm=cf565419f821dd0f0195f17d2662cf8c72ec81c1c3e0fda8954a1e5c159e9230c8d1118eba3a&token=1053291757&lang=zh_CN#rd) * [千万不要轻易尝试“熊猫烧香”,这不,我后悔了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492295&idx=1&sn=4d1a4407977febf2e8c2e234f42e1c30&chksm=cf5655c6f821dcd0eb1f6945217cb29a8ab89dd4aaeb507da173d2a94aae514c40c4185fad7e&token=1053291757&lang=zh_CN#rd) * [从小白程序员到大厂高级技术专家我看过哪些书籍?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492259&idx=1&sn=067e2f48577c23eadae9922f9bda0cf0&chksm=cf5655a2f821dcb47322fa711ba6e4c21d9d10ec52ef2c8d7c6ca228e402aba2ffd6f25303c2&token=1053291757&lang=zh_CN#rd) * [敲定了,冰河要搞大事情了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247492020&idx=1&sn=66b8e7335c05067ad0187a45642c1884&chksm=cf5656b5f821dfa3437f652040bfb3efe51390773de8c1b9e2be1aab1985f3c7e5ac642faf0f&token=1093611389&lang=zh_CN#rd) * [连续一年多没有周末是一种怎样的体验?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491980&idx=1&sn=cde6f4556e8dbb76798fd4aa9ac8aeb8&chksm=cf56568df821df9b22456fcc9495431d6fda8e756ef1a08410cf425388e9bbbf2ebcf08e069c&token=1093611389&lang=zh_CN#rd) * [小伙伴问我:「冰河技术」公号如何查看文章分类?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491860&idx=1&sn=7a5692c74f88ac0e6e62cad958fe50b0&chksm=cf565615f821df0321278f83d50e1bbb961e2b5a3e2b9945219d4ded56b7aa7d73f8bb840402&token=1093611389&lang=zh_CN#rd) * [新年新气象,2021年我们一起携手前行!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489715&idx=1&sn=f6153571a149be56ee5c9242eccc2c35&chksm=cf55afb2f82226a4ac7101378220b60528015989e1c805b59575f982d993e9556e411c9d42df&token=776659970&lang=zh_CN#rd) * [元旦在家撸了两天Seata源码,你们是咋度过的呢?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489729&idx=1&sn=64c1c1089a8ff2103673042c7ddf0d3c&chksm=cf55afc0f82226d6e5d54724349d358c7298a9bacc57d0419127656c443eb6c78678428d20cb&token=776659970&lang=zh_CN#rd) * [完了,这货迟早要超过我!我该怎么办?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490414&idx=1&sn=1bf8eec5825d0a1299e32e643d8e0eda&chksm=cf55ac6ff8222579bace9de611cb34e258664fecb27420d41def225d990a61fc6f464cd73cfb&token=776659970&lang=zh_CN#rd) * [冰河去腾讯了?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490481&idx=1&sn=d0e39c092165536f165ee5ceced37e2a&chksm=cf55acb0f82225a6d35f227e3b26d409a4b30683fb6531b3cad5e4c50c133ec3f79e8f0a1a9b&token=776659970&lang=zh_CN#rd) * [满满的干货,我不信你能看完!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490629&idx=1&sn=8fbbefd2da5f15e0e0827d2d50292385&chksm=cf55ab44f8222252c89aa414e75a818e370c0642eb3bd2eec03b6eefefc8b4dabedf9d156724&token=776659970&lang=zh_CN#rd) * [团年啦](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490753&idx=1&sn=0bbe4a255b687f88a5579635274e920a&chksm=cf55abc0f82222d6e56e724be444d354a6e6a4096b3c02bcf955d3d07e7595ad942db396496b&token=776659970&lang=zh_CN#rd) * [假期无聊冰河开发了一款国民级游戏!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490774&idx=1&sn=29f0db1cd78da44aa9e3b5c31a7d0b0a&chksm=cf55abd7f82222c1fa4a8637d1d789ddcafcac61ce789d249904cadac2d025c4b9ffdaa50829&token=776659970&lang=zh_CN#rd) * 
[过完年,该减肥了(说的不是我)!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490777&idx=1&sn=28665e6a5c1624e4d951f047b4839259&chksm=cf55abd8f82222ce7cb2ea80817d97c6312df955fa01fe5b9fe2e3d316a0f19603235b09118d&token=776659970&lang=zh_CN#rd) * [程序员究竟要不要读源码?!P8级别架构师带你一起辩证分析!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489400&idx=1&sn=4c86fb3474544a16c7853a0ad076fdc8&chksm=cf55a079f822296f938ce17fc191456e162dbb8e051b5b35f8b32326f73b42e66f885265ba29&token=776659970&lang=zh_CN#rd) * [很多小伙伴说想看我的真容,这次满足你的好奇心!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489223&idx=1&sn=1c7b1bce3ef9297f88bd77ba709cd59a&chksm=cf55a1c6f82228d041b09b2c6f778f93f6b32f05abf8984519d59e88acb7304f11e7addd89c4&token=776659970&lang=zh_CN#rd) * [跟小伙伴们聊聊我有趣的大学生活和我那两个好基友!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489213&idx=1&sn=48f7634975463aa8e966afc0a259f589&chksm=cf55a1bcf82228aa45809a62b7d84dc8078c198082f18470fe4ed76c77f91330d61225db0841&token=776659970&lang=zh_CN#rd) * [下班咯~老地方见!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489203&idx=1&sn=ba66069fb78dd1ffe8fe03a3d1a73d21&chksm=cf55a1b2f82228a4983c48868a504351d5c8b2b5772dfe53d2dec9333a2b184a2d89ca4a730c&token=776659970&lang=zh_CN#rd) * [大家国庆中秋准备怎么过呀?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489171&idx=1&sn=9ca9e36e3fbdd407c9b1748511189244&chksm=cf55a192f822288410140921805b5cce0f059ec6e00c7942bfdfbb9f465f47a2860b5d38b8d3&token=776659970&lang=zh_CN#rd) * [接下来,冰河要有大动作了!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489095&idx=1&sn=7c36585b322424d3da826f815181d04c&chksm=cf55a146f8222850ab8b7503937331d2edd81f9e119228aa68eb5f374ba4a8b49c4ca092cf03&token=776659970&lang=zh_CN#rd) * [致读者:关于冰河技术微信公众号的重大调整!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489093&idx=1&sn=324f2847939fc2a45eab802113fc0b9c&chksm=cf55a144f82228522acebae4887df472b413fc2d1c558175d041d8295d68202a7885079eaf9f&token=776659970&lang=zh_CN#rd) * [冰河技术微信公众号终于可以留言了(文末领福利)!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489478&idx=1&sn=b06a4ac9282679fe601170a85a739fdc&chksm=cf55a0c7f82229d11b3b0b196793647422a774296bff91891604f4ed37219d6b7a0dcdbffa94&token=776659970&lang=zh_CN#rd) * [毕业三年,如何达到月薪30K?我想跟你聊聊!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489631&idx=1&sn=755cff80d63168634464666cffc74219&chksm=cf55af5ef8222648b65e1b9963d799b4fcf8563cc0c962bcaa06c34cfb2016a9167ef1d15b82&token=776659970&lang=zh_CN#rd) * [冰河是谁?到底是干啥的?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489644&idx=1&sn=f77943df3a6f2cc33974f83d7f536627&chksm=cf55af6df822267bbe9ed604c59db4cd71b004c489bc58d7441b3066455450ce5b066113186a&token=776659970&lang=zh_CN#rd) * [总结2020:5个月出版两本书,日更公众号是一种怎样的体验?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489703&idx=1&sn=5fb966a0269ff9728e827ea4ecb0fb67&chksm=cf55afa6f82226b064a19b7bcbb39b6e9b0d7dc4fbc6b0fdc344206ac25eeb80d14382986a7e&token=776659970&lang=zh_CN#rd) ### 视频系列 * [2分钟完美安装Docker,学不会你捶我!!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494076&idx=1&sn=2502fc281ef2b42e7aca42ddfd72136b&chksm=cf565ebdf821d7abaa5079600af6f60d1223eec4d9375b634bdee59574381ac97591ef5d7fbc&token=1929495520&lang=zh_CN#rd) * [冰河开始“拍片儿”了?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247494009&idx=1&sn=e401cf9a39ce8106df0a6d6cef701f6f&chksm=cf565e78f821d76ee8f32204cb7681d90410ab2c8962e5f0299b23d222d0a45e80f11ab14b48&token=1929495520&lang=zh_CN#rd) * 
[看病排队大半天,诊断2分钟!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247493675&idx=1&sn=de33bb8ab2029a992f398af154760264&chksm=cf565f2af821d63ce322648dcf74baeb8b9c8f9bad20cbbe78ef6c153abcb86e12b78544fcd4&token=1929495520&lang=zh_CN#rd) * [2分钟完美手动安装Docker环境!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247491959&idx=1&sn=37f033cc90378390d140f12de03c1ad8&chksm=cf565676f821df6033b4bd1632f93ee1ed7564a2ff473dacd0b76f1e594bed5e0bd2b963ec92&token=1093611389&lang=zh_CN#rd) * [春节喜气洋洋,小区给大家发红包,我也沾沾喜气!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490755&idx=1&sn=a57491f87164366ada53577f4218078f&chksm=cf55abc2f82222d406c09d4f881d8f83ff9e78db364cf770f1633e69baa00e937be7830fce9c&token=776659970&lang=zh_CN#rd) * [AI正在深刻改变人们的生活!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490768&idx=1&sn=932d696c84b4d30358a7c4a6696dcd30&chksm=cf55abd1f82222c7c2763207783d80b06da060dc6c8babb77838bdd02565bf88c8c057f64a86&token=776659970&lang=zh_CN#rd) * [冰河假期去哪啦?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489644&idx=1&sn=f77943df3a6f2cc33974f83d7f536627&chksm=cf55af6df822267bbe9ed604c59db4cd71b004c489bc58d7441b3066455450ce5b066113186a&token=776659970&lang=zh_CN#rd) * [在服务器命令行输入 rm -rf /* 到底会发生什么?](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247490883&idx=1&sn=f90d57b6cb74e138b56106df10b597b9&chksm=cf55aa42f8222354e09e0fed24762d7e29f9965d5fa01412c6fa23924a54f65f061afedcfbac&token=776659970&lang=zh_CN#rd) ### 小工具 * [IDEA设置System.out.println()和main方法快捷键](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489466&idx=1&sn=4cd45f2f6c4599148ea148c6a69d2ff2&chksm=cf55a0bbf82229ade4688bd216f0f7229cf6ec236d67f2480d8178fe4ab2bed278b2fd2b3a84&token=776659970&lang=zh_CN#rd) * [Java VisualVM添加Visual GC插件实现JVM性能调优](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489461&idx=1&sn=d54fac64e91a459bc0f286beb7c500cb&chksm=cf55a0b4f82229a229aad37fef5c4c942bdee9db141ea91045e908b7a55b9b27ee32d50f0770&token=776659970&lang=zh_CN#rd) * [原来Java是这样实现微信小程序加密与解密数据算法的!超赞的有木有?!](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489452&idx=1&sn=0325302bf29eab9b048eb3016f612d1d&chksm=cf55a0adf82229bb3e1e88f584f3e5224aea0303963c700d54f0936a01c5e00ac1b7b17d0b30&token=776659970&lang=zh_CN#rd) ### Python专题 * [智能化时代你还在手动刷新浏览器抢火车票?那你就真的Out了!假期快来了,冰河提前放大招,教你如何使用Python脚本自动抢火车票](https://mp.weixin.qq.com/s?__biz=Mzg4MjU0OTM1OA==&mid=2247489453&idx=1&sn=ebf0cc517519c5b088bbacabf1cdcfb2&chksm=cf55a0acf82229bada0f39ba67550b93622bfdd4cd9bab605fd8d14aab11f141a574e31fd890&token=776659970&lang=zh_CN#rd) ## 重磅福利 微信搜一搜【冰河技术】微信公众号,关注这个有深度的程序员,每天阅读超硬核技术干货,公众号内回复【PDF】有我准备的一线大厂面试资料和我原创的超硬核PDF技术文档,以及我为大家精心准备的多套简历模板(不断更新中),希望大家都能找到心仪的工作,学习是一条时而郁郁寡欢,时而开怀大笑的路,加油。如果你通过努力成功进入到了心仪的公司,一定不要懈怠放松,职场成长和新技术学习一样,不进则退。如果有幸我们江湖再见! 另外,我开源的各个PDF,后续我都会持续更新和维护,感谢大家长期以来对冰河的支持!! 回复【并发编程2】获取冰河最新整理的《深入理解高并发编程(第2版)》电子书。 回复【并发编程】获取全网累计下载60W+的《深入理解高并发编程》电子书。 回复【渗透笔记】获取全网5星好评的《冰河的渗透实战笔记》电子书。 ![](images/gongzhonghao.png)
0x4D31/detection-and-response-pipeline
https://github.com/0x4D31/detection-and-response-pipeline
✨ A compilation of suggested tools/services for each component in a detection and response pipeline, along with real-world examples. The purpose is to create a reference hub for designing effective threat detection and response pipelines. 👷 🏗
# Detection and Response Pipeline ✨ A compilation of suggested tools for each component in a detection and response pipeline, along with real-world examples. The purpose is to create a reference hub for designing effective threat detection and response pipelines. 👷 🏗 Join us, explore the curated content, and contribute to this collaborative effort. ## Contents Main Components of a Detection & Response Pipeline: 0. 📦 [Detection-as-Code Pipeline](#detection-as-code-pipeline) 1. 🪵 [Data Pipeline](#data-pipeline) 2. ⚠️ [Detection and Correlation Engine](#detection-and-correlation-engine) 3. ⚙️ [Response Orchestration and Automation](#response-orchestration-and-automation) 4. 🔍 [Investigation and Case Management](#investigation-and-case-management) 💡 [Real-world Examples](#real-world-examples) 📑 [Additional Resources](#additional-resources) ## Detection-as-Code Pipeline | Tool / Service | Purpose | |---|---| | GitHub | Detection content development | | GitLab | Detection content development | | Gitea | Detection content development | | AWS CodeCommit | Detection content development | | GitHub Actions | CI/CD pipeline | | GitLab Runner | CI/CD pipeline | | Drone | CI/CD pipeline | | AWS CodePipeline | CI/CD pipeline | ### Resources - [Automating Detection-as-Code](https://www.tines.com/blog/automating-detection-as-code): An example reference that uses GitHub for detection content development, GitHub Actions for CI/CD, Elastic as SIEM, GitHub Issues for alert management, and Tines for alert and response handling. - [Practical Detection-as-Code](https://www.infosecb.com/threat/detection/2021/11/21/detection-as-code.html): An example Detection-as-Code pipeline implementation using Sigma rules, GitLab CI/CD, and Splunk. - CI/CD Detection Engineering ([part 1](https://www.splunk.com/en_us/blog/security/ci-cd-detection-engineering-splunk-security-content-part-1.html), [part 2](https://www.splunk.com/en_us/blog/security/ci-cd-detection-engineering-splunk-s-attack-range-part-2.html), [part 3](https://www.splunk.com/en_us/blog/security/ci-cd-detection-engineering-failing-part-3.html), [part 4](https://www.splunk.com/en_us/blog/security/ci-cd-detection-engineering-dockerizing-for-scale-part-4.html)): An example CI/CD detection engineering workflow in a Splunk environment. 
- From soup to nuts: Building a Detection-as-Code pipeline ([Part 1](https://medium.com/threatpunter/from-soup-to-nuts-building-a-detection-as-code-pipeline-28945015fc38), [Part 2](https://medium.com/threatpunter/from-soup-to-nuts-building-a-detection-as-code-pipeline-59b96c875ca1)) ## Data Pipeline | Tool / Service | Purpose | Deployment | |---|---|---| | [Substation](https://github.com/brexhq/substation) | Data movement and transformation | Self-hosted (Open Source) | | [Vector](https://github.com/vectordotdev/vector) | Data movement and transformation | Self-hosted (Open Source) | | [Tenzir](https://github.com/tenzir/tenzir) | Data movement and transformation | Self-hosted (Open Source) | | [Fluent Bit](https://github.com/fluent/fluent-bit) | Data movement and transformation | Self-hosted (Open Source) | | [Logstash](https://github.com/elastic/logstash) | Data movement and transformation | Self-hosted (Open Source) | | [Airbyte](https://github.com/airbytehq/airbyte) | Data movement and transformation | Self-hosted (Open Source) and Cloud | | [Cribl Stream](https://cribl.io/) | Data movement and transformation | Self-hosted (Free), Hybrid and Cloud | | [Tarsal](https://tarsal.co) | Data movement and transformation | Cloud | | [Kafka](https://kafka.apache.org/) | Stream processing | Self-hosted (Open source) and Cloud ([Confluent](https://www.confluent.io/)) | | [Amazon Kinesis Data Streams](https://aws.amazon.com/kinesis/data-streams) | Stream processing | Cloud | | [Apache Spark](https://spark.apache.org/) | Stream and batch processing | Self-hosted (Open source) | | [Databricks](https://www.databricks.com/) | Stream and batch processing | Cloud | | [Google Cloud DataFlow](https://cloud.google.com/dataflow) | Stream and batch processing | Cloud | | [Apache Flink](https://flink.apache.org/) | Stream and batch processing | Self-hosted (Open source) | | [Apache NiFi](https://nifi.apache.org/) | Stream and batch processing | Self-hosted (Open source) | | [Apache Beam](https://beam.apache.org/) | Stream and batch processing | Open source; Self-hosted or cloud-based runner | | [Faust](https://faust.readthedocs.io/en/latest/) | Stream and batch processing | Self-hosted (Open source) | ## Detection and Correlation Engine In addition to the stream and batch processing tools mentioned in the [data pipeline](#data-pipeline) section, the following tools can be used for data analysis and detection. 
| Tool / Service | Description | |---|---| | [Elasticsearch](https://github.com/elastic/elasticsearch) | with [ElastAlert2](https://github.com/jertel/elastalert2) or Elastic $ecurity | | [OpenSearch](https://github.com/opensearch-project/OpenSearch) | with [ElastAlert2](https://github.com/jertel/elastalert2) or [OpenSearch Alerting](https://github.com/opensearch-project/alerting) | | [Amazon Kinesis Data Analytics](https://aws.amazon.com/kinesis/data-analytics/) | Streaming data analysis in real time using Apache Flink | | [Matano](https://github.com/matanolabs/matano) | Open source security lake platform for AWS | | [ksqlDB](https://github.com/confluentinc/ksql) | SQL-Based Streaming for Kafka | | [StreamAlert](https://github.com/airbnb/streamalert) | Real-time data analysis and alerting framework | ## Response Orchestration and Automation | Tool / Service | Description | |---|---| | [Shuffler](https://github.com/Shuffle/Shuffle) | A general purpose security automation platform | | [Torq](https://www.torq.io/) | No-code hyperautomation for security workflows | | [Tines](https://www.tines.com/) | No-code automation for security workflows | ## Investigation and Case Management | Tool / Service | Description | Deployment | |---|---|---| | [DFIR IRIS](https://dfir-iris.org) | Open-Source Collaborative Incident Response Platform | Self-hosted (Open source) | | [TheHive](https://github.com/TheHive-Project/TheHive) | Open Source and Free Security Incident Response Platform | Self-hosted (Open source) | | [GitHub](https://www.github.com) | GitHub issues can be used for case management. Check out the video in the Resources section. | Cloud | | [Jira Service Management](https://www.atlassian.com/software/jira/service-management/features/itsm#incident-management) | IT service management platform with incident management features | Cloud | | [Tines Cases](https://www.tines.com/product/cases) | | Cloud | | [Torq Case Management](https://torq.io/blog/hyperautomation-case-management/) | | Cloud | ### Resources: - [Using Github For Automated Case Management](https://www.youtube.com/watch?v=PqMI6euJS7s): Material Security's presentation on how its security team is automating case management using GitHub and Tines. ## Real-world Examples Please note that this information is extracted from public blog posts and conference talks, and may not be comprehensive or reflect the current state of the companies' pipelines. Some examples may focus on specific components, such as the correlation engine, rather than covering the entire pipeline. These examples are intended as starting points, so please view them as informative rather than definitive solutions. If you have additional information or insights about any of the examples included here and **have permission to share** them, we encourage you to contribute by sending a pull request to enhance or add more details. | # | Technologies / Components | Note | References | |---|---|---|---| | 0 | • Databricks <br>• Apache Spark <br>• Delta Lake <br>• Scala | _"Apple must detect a wide variety of security threats, and rises to the challenge using Apache Spark across a diverse pool of telemetry. Some of the home-grown solutions we’ve built to address complications of scale: <br>1. 
**Notebook-based testing CI** – Previously we had a hybrid development model for Structured Streaming jobs wherein most code would be written and tested inside of notebooks, but unit tests required export of the notebook into a user’s IDE along with JSON sample files to be executed by a local SparkSession. We’ve deployed a novel CI solution leveraging the Databricks Jobs API that executes the notebooks on a real cluster using sample files in DBFS. When coupled with our new test-generation library, we’ve seen 2/3 reduction in the amount of time required for testing and 85% less LoC. <br>2. **Self-Tuning Alerts** – Apple has a team of security analysts triaging the alerts generated by our detection rules. They annotate them as either ‘False Positive’ or ‘True Positive’ following the results of their analysis. We’ve incorporated this feedback into our Structured Streaming pipeline, so the system automatically learns from consensus and adjusts future behavior. This helps us amplify the signal from the rest of the noise. <br>3. **Automated Investigations** – There are some standard questions an analyst might ask when triaging an alert, like: what does this system usually do, where is it, and who uses it? Using ODBC and the Workspace API, we’ve been able to templatize many investigations and in some cases automate the entire process up to and including incident containment. <br>4. **DetectionKit** – We’ve written a custom SDK to formalize the configuration and testing of jobs, including some interesting features such as modular pre/post processor transform functions, and a stream-compatible exclusion mechanism using foreach Batch."_ | 1. [Scaling Security Threat Detection with Apache Spark and Databricks](https://www.youtube.com/watch?v=YxTE4mff5dk) by Josh Gillner (Apple Detection Engineering) <br> 2. [Threat Detection and Response at Scale](https://www.youtube.com/watch?v=SFeBJxI4Q98) by Dominque Brezinski (Apple) | | 1 | • Kafka <br>• Apache Spark <br>• Apache Hive <br>• Elasticsearch<br>• GraphQL<br>• Amazon S3<br>• Slack<br>• PagerDuty | ![](img/Netflix_socless.png) | [A SOCless Detection Team at Netflix](https://www.linkedin.com/pulse/socless-detection-team-netflix-alex-maestretti/) by Alex Maestretti (Netflix) | | 2 | • Kafka <br>• [Apache Samza](https://samza.apache.org/) <br>• Microsoft Sentinel? <br>• [KQL](https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/) <br>• Azure Pipelines and Repos for CI/CD pipeline <br>• Jira <br>• ServiceNow <br>• Serverless functions | *high-level strategy* <br>![](img/Linkedin-high-level-strategy.png) <br> *Simplified data collection pipeline* <br>![](img/Linkedin-simplified-data-pipelines.png) | [(Re)building Threat Detection and Incident Response at LinkedIn](https://engineering.linkedin.com/blog/2022/-re-building-threat-detection-and-incident-response-at-linkedin) by Sagar Shah and Jeff Bollinger (Linkedin) | | 3 | • [go-audit](https://github.com/slackhq/go-audit) <br>• Elasticsearch <br>• [ElastAlert](https://github.com/Yelp/elastalert)[0] | _"We send the events to an Elasticsearch cluster. From there we use ElastAlert to query our incoming data continuously for alert generation and general monitoring."_ | [Syscall Auditing at Scale](https://slack.engineering/syscall-auditing-at-scale/) by Ryan Huber (Slack) | | 4 | • Kafka <br> • Jupyter notebook <br> • Python <br> • osquery, Santa, and OpenBSM/Audit for MacOS monitoring | _"Alertbox was the first project we built to start cutting down on our triage time. 
The goal was to move our alert response runbooks into code, and have them execute before we even begin the triage process. <br> Think of Forerunner as the glue between Alertbox and Covenant. When an alert fires, Alertbox calls out a RPC service called Forerunner. This service returns a Jupyter notebook corresponding to the alert. Alertbox then embeds the URL of this Jupyter notebook into the alert ticket. In the background, Forerunner also runs this alert notebook asynchronously."_ <br> ![](img/Dropbox_pipeline.png) | 1. [How Dropbox Security builds tools for threat detection and incident response](https://dropbox.tech/security/how-dropbox-security-builds-better-tools-for-threat-detection-and-incident-response) by Dropbox DART <br> 2. [MacOS monitoring the open source way](https://dropbox.tech/security/macos-monitoring-the-open-source-way) by Michael George (Dropbox) <br> 3. [OLD] [Meet Securitybot: Open Sourcing Automated Security at Scale](https://dropbox.tech/security/meet-securitybot-open-sourcing-automated-security-at-scale) by Alex Bertsch (Dropbox) and [Distributed Security Alerting](https://slack.engineering/distributed-security-alerting/) by Ryan Huber (Slack) | | 5 | • [StreamAlert](https://github.com/airbnb/streamalert)<br>• [BinaryAlert](https://github.com/airbnb/binaryalert) | - _"StreamAlert is a serverless, real-time data analysis framework which empowers you to ingest, analyze, and alert on data from any environment, using data sources and alerting logic you define. Computer security teams use StreamAlert to scan terabytes of log data every day for incident detection and response."_<br>- _"BinaryAlert is an open-source serverless AWS pipeline where any file uploaded to an S3 bucket is immediately scanned with a configurable set of YARA rules. An alert will fire as soon as any match is found, giving an incident response team the ability to quickly contain the threat before it spreads."_ | 1. [StreamAlert: Real-time Data Analysis and Alerting](https://medium.com/airbnb-engineering/streamalert-real-time-data-analysis-and-alerting-e8619e3e5043) by Airbnb Eng <br> 2. [BinaryAlert: Real-time Serverless Malware Detection](https://medium.com/airbnb-engineering/binaryalert-real-time-serverless-malware-detection-ca44370c1b90) by Austin Byers (Airbnb) | 6 | • ELK stack <br> • Kafka <br> • [KSQL](https://github.com/confluentinc/ksql) <br> • [ES-Hadoop](https://github.com/elastic/elasticsearch-hadoop) <br> • ElastAlert[0] <br> • Apache Spark <br> • Jupyter notebook <br> • GraphFrames | _"The Hunting ELK or simply the HELK is one of the first open source hunt platforms with advanced analytics capabilities such as SQL declarative language, graphing, structured streaming, and even machine learning via Jupyter notebooks and Apache Spark over an ELK stack. This project was developed primarily for research, but due to its flexible design and core components, it can be deployed in larger environments with the right configurations and scalable infrastructure."_ <br> ![](img/HELK_design.png) | [The Hunting ELK](https://github.com/Cyb3rWard0g/HELK) project by Roberto Rodriguez | | 7 | • AWS Kinesis Firehose <br> • AWS Kinesis Data Analytics Application <br> • AWS Lambda <br> • AWS S3 <br> • AWS Athena <br> • AWS Simple Notification Services | _"In this example, various AWS serverless application services are used together to create a detection pipeline that is capable of near-realtime detection. 
The pipeline requires no administrative overhead of servers or container infrastructure, enabling a detection and response team to focus on threat detection capabilities."_ <br> ![](img/AWSServerlessArchitecture.png) | [Building a Serverless Detection Platform in AWS Pt. I: Endpoint Detection](https://infosecb.medium.com/building-a-serverless-detection-platform-in-aws-pt-i-endpoint-detection-1a0e34d9c28c) by Brendan Chamberlain | 0. ElastAlert is no longer maintained. You can use [ElastAlert2](https://github.com/jertel/elastalert2) instead. ## Additional Resources - [Awesome Threat Detection and Hunting](https://github.com/0x4D31/awesome-threat-detection): A curated list of awesome threat detection and hunting resources. - [Awesome Detection Engineering](https://github.com/infosecB/awesome-detection-engineering): A list of useful Detection Engineering-related resources. ## License [![CC0](http://mirrors.creativecommons.org/presskit/buttons/88x31/svg/cc-zero.svg)](http://creativecommons.org/publicdomain/zero/1.0) To the extent possible under law, Adel "0x4D31" Karimi has waived all copyright and related or neighboring rights to this work.
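As a minimal illustration of the Detection-as-Code idea from the first section — a rule expressed as code plus tests that a CI job can run on every change — here is a generic Python sketch. It is deliberately not tied to any of the tools listed above, and the event fields, sources, and allow-list are purely illustrative assumptions.

```python
# Generic illustration of detection-as-code (not specific to any tool listed above):
# a rule is a plain function over a parsed log event, and CI runs its tests on every PR.
from dataclasses import dataclass


@dataclass
class Event:
    source: str   # e.g. "cloudtrail" (illustrative field names)
    action: str
    user: str
    country: str


def suspicious_console_login(event: Event) -> bool:
    """Alert on console logins from countries outside the allow-list (example logic)."""
    allowed = {"US", "CA"}
    return (
        event.source == "cloudtrail"
        and event.action == "ConsoleLogin"
        and event.country not in allowed
    )


def test_rule_fires_on_unexpected_country() -> None:
    evt = Event(source="cloudtrail", action="ConsoleLogin", user="alice", country="RU")
    assert suspicious_console_login(evt)


def test_rule_ignores_allowed_country() -> None:
    evt = Event(source="cloudtrail", action="ConsoleLogin", user="bob", country="US")
    assert not suspicious_console_login(evt)


if __name__ == "__main__":
    test_rule_fires_on_unexpected_country()
    test_rule_ignores_allowed_country()
    print("all detection rule tests passed")
```

In a real pipeline the rule module would live in the detection content repository, and the CI stage (GitHub Actions, GitLab Runner, etc.) would simply run the test suite before the rule is deployed to the detection engine.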
xl-alt/free_gpt4
https://github.com/xl-alt/free_gpt4
国内 镜像网站 免费提供api支持,仅供个人使用,
## GPT-4-free-api

### Introduction

> ❗️ The free API key may only be used for personal non-commercial purposes, education, and non-profit research. Commercial use and large-scale training of commercial models are strictly prohibited!
> The site and the API take privacy seriously and are committed to protecting users' privacy. This project does not collect, log, or store any text entered by users or returned by the OpenAI servers in any way, and it does not provide OpenAI or any third party with any information about the identity of API callers; all text messages are kept locally. Note, however, that OpenAI itself retains data for 30 days in accordance with its [data usage policy](https://platform.openai.com/docs/data-usage-policies).

#### 👏 Advertisement

The [mirror site for mainland China](https://docs.chatanywhere.com.cn/) supports integrated deployment of GPT-4.0, DALL·E, Claude, Azure OpenAI, Stable Diffusion, and more. See [https://wxblog.xyz](https://wxblog.xyz) for details.

#### Getting an API key

**Get your API key at [https://wxblog.xyz](https://wxblog.xyz): log in and check it in your personal center. Join QQ group 743876738 — any issues with the API will be announced there promptly!**

**The API comes in a paid edition and a free edition; the paid edition is more stable.**

Models supported by the paid edition: `gpt-3.5-turbo-0301`, `gpt-3.5-turbo`, `gpt-3.5-turbo-0613`, `gpt-3.5-turbo-16k`, `gpt-3.5-turbo-16k-0613`, `gpt-4`, `gpt-4-0613`, `claude-instant-100k`, `claude-instant`

Models supported by the free edition: `gpt-4`, `gpt-4-0613`, `gpt-3.5`

### API endpoints and parameters

#### Paid edition

Request URL: `https://wxblog.xyz/api/blog/v1/chat/completions`

Request headers: `Authorization: Bearer YOURKEY`

Parameters:

```JavaScript
{
    "messages": [
        {
            "role": "user",
            "content": "你是谁"
        }
    ], // messages, required
    "stream": true, // whether to stream the output
    "model": "gpt-4-0613", // model to use, required
    "temperature": 0.5, // controls how divergent the output is
    "presence_penalty": 2
}
```

#### Free edition

Request URL: `https://wxblog.xyz/api/blog/free/v1/chat/completions`

Request headers: `Authorization: Bearer YOURKEY`

Parameters:

```JavaScript
{
    "messages": [
        {
            "role": "user",
            "content": "你是谁"
        }
    ], // messages, required
    "stream": true, // whether to stream the output
    "model": "gpt-4-0613", // model to use, required
    "temperature": 0.5, // controls how divergent the output is
    "presence_penalty": 2
}
```

#### Advantages of the API

1. Both the paid and the free edition support GPT-4.
2. The interface follows the official API standard exactly (the free edition only returns text and does not support a system prompt), so it is compatible with all kinds of software and plugins.
3. Streaming responses are supported.
4. Routes inside mainland China use dynamic acceleration.
5. No proxy is needed; it works directly from a mainland China network.
6. Completely free for personal use.

#### How to use

Stable site for mainland China: [https://www.openapi.technology/](https://www.openapi.technology/)

In the settings, change the "API address" to the request URL above (tip: do not paste the `/v1/chat/completions` part of the address).

Fill in your key in the apikey field and you are ready to go.

#### Mirror site for mainland China

[https://wxblog.xyz](https://wxblog.xyz) can be used directly without a proxy. To use the paid API, simply upgrade on that site. If you run into problems, leave a message in the QQ group; we maintain the API continuously to keep it stable.
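As a quick usage sketch, here is a minimal Python call to the free endpoint described above. It assumes the stream follows the standard OpenAI SSE chunk format (`data: {...}` lines terminated by `[DONE]`), which matches the interface-parity claim but has not been verified against this service, and `YOURKEY` is a placeholder for your own key.

```python
# Minimal sketch: calling the free endpoint with streaming enabled.
# Assumes OpenAI-style SSE lines ("data: {...}" ending with "[DONE]");
# replace YOURKEY with the key from your personal center.
import json
import requests

URL = "https://wxblog.xyz/api/blog/free/v1/chat/completions"
HEADERS = {"Authorization": "Bearer YOURKEY", "Content-Type": "application/json"}
payload = {
    "messages": [{"role": "user", "content": "hello"}],
    "stream": True,
    "model": "gpt-4-0613",
    "temperature": 0.5,
    "presence_penalty": 2,
}

with requests.post(URL, headers=HEADERS, json=payload, stream=True, timeout=60) as resp:
    resp.raise_for_status()
    for raw in resp.iter_lines(decode_unicode=True):
        if not raw or not raw.startswith("data: "):
            continue
        data = raw[len("data: "):]
        if data.strip() == "[DONE]":
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0].get("delta", {})
        print(delta.get("content", ""), end="", flush=True)
print()
```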
imadeanaccount1/wasteof.plus
https://github.com/imadeanaccount1/wasteof.plus
WIP browser extension that adds cool, customizable addons to the wasteof.money social media. ⭐
# wasteof.plus

<!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section -->
[![All Contributors](https://img.shields.io/badge/all_contributors-3-orange.svg?style=flat-square)](#contributors-)
<!-- ALL-CONTRIBUTORS-BADGE:END -->
[![](https://img.shields.io/github/stars/imadeanaccount1/wasteof.plus?color=blue&style=flat-square)](https://github.com/imadeanaccount1/wasteof.plus/stargazers)
[![](https://img.shields.io/github/forks/imadeanaccount1/wasteof.plus?color=blue&style=flat-square)](https://github.com/imadeanaccount1/wasteof.plus/network/members)
[![](https://img.shields.io/github/watchers/imadeanaccount1/wasteof.plus?color=blue&style=flat-square)](https://github.com/imadeanaccount1/wasteof.plus/watchers)
[![](https://img.shields.io/github/issues/imadeanaccount1/wasteof.plus?color=green&style=flat-square)](https://github.com/imadeanaccount1/wasteof.plus/issues)
[![](https://img.shields.io/github/issues-pr/imadeanaccount1/wasteof.plus?color=green&style=flat-square)](https://github.com/imadeanaccount1/wasteof.plus/pulls)
[![](https://img.shields.io/github/license/imadeanaccount1/wasteof.plus?style=flat-square)](https://github.com/imadeanaccount1/wasteof.plus/blob/master/LICENSE)
[![](https://img.shields.io/badge/website-wasteofplus.github.io-ff7b26.svg?style=flat-square&color=blue)](https://wasteofplus.github.io)

wasteof.plus is a browser extension that adds customizable addons to the wasteof.money social media. 🌠

## Getting Started:

Currently, wasteof.plus is not available on the Chrome web store, Edge add-on store, or the Firefox add-on marketplace. You must install it manually using one of the following sets of instructions:

### Browser Support

| Chrome | Edge | Firefox | Opera |
| -------- | ------ | --------- | -------- |
| ✅ Tested, Working | ✅ Tested, Working | ❌ Not Tested | ✅ Tested, Working |

#### For Chrome, Edge, and Opera:

- Go to the [Releases tab](https://github.com/imadeanaccount1/wasteof.plus/releases) and download the latest release for your browser platform. Edge, Chrome, and Opera users should download releases marked as "chromium" while Firefox users should use "firefox" releases.
- Extract the Zip file on your device.
- Go to chrome://extensions (or opera://extensions, edge://extensions) in a new browser tab and toggle developer mode so that it's enabled.
- Click "Load Unpacked" and select the extracted folder.
- There you go! You've successfully installed wasteof.plus. To test it out, try going to [wasteof.money](https://wasteof.money).

#### Firefox

wasteof.plus is still being ported to work in Firefox. You can follow development progress [here](https://github.com/imadeanaccount1/wasteof.plus/pull/32).

## Features:

- Adds beta, banned, admin, and verified badges next to usernames across the site.
(in messages, on your feed, on explore, and on profiles) - Shows online status dot indicator across the site (same places as above) - Adds hover cards for user profiles on links - Adds desktop notifications for new messages - Adds an extension badge showing the number of new messages and plays a sound effect - Posts a message in wasteof.money/chat when you leave or enter chat - Allows you to create and vote in rich polls <!--- ![followsyoubadge - Copy (3)](https://github.com/imadeanaccount1/wasteof.plus/assets/138229538/1c4c32bb-8951-4055-b820-9b9ebd545167) ![Screenshot2023070714](https://github.com/imadeanaccount1/wasteof.plus/assets/138229538/73f53d4a-91e3-4cd7-996f-d49fb1ad1b47) --> ## Roadmap (not in any particular order): - Typescript - Clean up code ## Contributing: Find information about contributing code/addons in the [CONTRIBUTING.md](CONTRIBUTING.md) file. ## License: wasteof.plus is licensed under the BSD-3 license. read it [here](LICENSE). It is primarily maintained by @imadeanaccount1. ## Motivation: Make wasteof.money better by adding cool community-requested features that may or may not necessarily fit into the base social media and may or may not be wanted by everybody (users have the option to turn on/off addons.) ## Other Projects Wasteof.plus is inspired by [ScratchAddons](https://github.com/ScratchAddons/ScratchAddons), which is developed by the [ScratchAddons](https://github.com/ScratchAddons) team, and wasteof.mobile by [Micah Lindley](https://github.com/micahlt). Wasteof.money is developed by [jeffalo.](https://github.com/jeffalo) ## Contributors <!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section --> <!-- prettier-ignore-start --> <!-- markdownlint-disable --> <table> <tbody> <tr> <td align="center" valign="top" width="14.28%"><a href="https://github.com/imadeanaccount1"><img src="https://avatars.githubusercontent.com/u/138229538?v=4?s=100" width="100px;" alt="imadeanaccount"/><br /><sub><b>imadeanaccount</b></sub></a><br /><a href="https://github.com/imadeanaccount1/wasteof.plus/commits?author=imadeanaccount1" title="Code">💻</a> <a href="#design-imadeanaccount1" title="Design">🎨</a> <a href="#ideas-imadeanaccount1" title="Ideas, Planning, & Feedback">🤔</a> <a href="#maintenance-imadeanaccount1" title="Maintenance">🚧</a> <a href="#infra-imadeanaccount1" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td> <td align="center" valign="top" width="14.28%"><a href="https://github.com/mybearworld"><img src="https://avatars.githubusercontent.com/u/130385691?v=4?s=100" width="100px;" alt="mybearworld"/><br /><sub><b>mybearworld</b></sub></a><br /><a href="https://github.com/imadeanaccount1/wasteof.plus/issues?q=author%3Amybearworld" title="Bug reports">🐛</a> <a href="#ideas-mybearworld" title="Ideas, Planning, & Feedback">🤔</a> <a href="https://github.com/imadeanaccount1/wasteof.plus/commits?author=mybearworld" title="Code">💻</a></td> <td align="center" valign="top" width="14.28%"><a href="https://github.com/Gilbert189"><img src="https://avatars.githubusercontent.com/u/57717273?v=4?s=100" width="100px;" alt="Gilbert189"/><br /><sub><b>Gilbert189</b></sub></a><br /><a href="https://github.com/imadeanaccount1/wasteof.plus/commits?author=Gilbert189" title="Code">💻</a> <a href="#ideas-Gilbert189" title="Ideas, Planning, & Feedback">🤔</a></td> </tr> </tbody> </table> <!-- markdownlint-restore --> <!-- prettier-ignore-end --> <!-- ALL-CONTRIBUTORS-LIST:END --> <!-- prettier-ignore-start --> <!-- markdownlint-disable --> <!-- markdownlint-restore --> 
<!-- prettier-ignore-end --> <!-- ALL-CONTRIBUTORS-LIST:END -->
geeeeeeeek/java_video
https://github.com/geeeeeeeek/java_video
基于Java的视频点播网站,基于java+springboot+vue开发的短视频播放系统
# Java Video-on-Demand Website

> If you run into problems while studying the project, feel free to contact the author.

### Demo

Front-end: http://video.gitapp.cn

Admin back-end: http://video.gitapp.cn/admin

Admin account:

Username: admin123  Password: admin123

### Features

The platform uses a B/S (browser/server) architecture. The back end is built with the mainstream Spring Boot framework, and the front end is built with Vue.js.
The platform consists of a user-facing front end and an admin back end.

- Front-end features: home page, video list, video detail page, interaction center, and personal center modules.
- Back-end features: overview, video management, category management, tag management, comment management, user management, operations management, log management, and system information modules.

### Intended Audience

College students, system designers, and coursework projects.

### Code Structure

- The `server` directory contains the back-end code.
- The `web` directory contains the front-end code.

### Deployment

#### Back end

(1) After downloading the code, open the `server` directory with IntelliJ IDEA.

(2) Edit the application.yml file to configure the database and the upload root directory.

(3) Install MySQL 5.7 and create the database with the following SQL:

```
CREATE DATABASE IF NOT EXISTS xxx DEFAULT CHARSET utf8 COLLATE utf8_general_ci
```

(4) Restore the SQL data by running the following commands in MySQL:

```
mysql> use xx;
mysql> source D:/xxx/xxx/xxx.sql;
```

(5) Start the back-end service by clicking the Run button at the top of IDEA.

#### Front end

(1) Install Node 16.14.

(2) Open a terminal in the `web` directory and install the dependencies:

```
npm install
```

(3) Run the project:

```
npm run dev
```

### Screenshots

Home page

![](https://raw.githubusercontent.com/geeeeeeeek/java_video/master/server/upload/image/a.png)

Admin pages

![](https://raw.githubusercontent.com/geeeeeeeek/java_video/master/server/upload/image/b.png)

### Planned Features

- Email notifications
- Phone number binding
- Follower subscriptions
- Payments

### Paid Consulting

WeChat: lengqin1024
castaneai/apubhook
https://github.com/castaneai/apubhook
Incoming Webhook-like server for ActivityPub.
# APubHook

Incoming Webhook-like server for ActivityPub.

You can post messages to the Fediverse, much like an Incoming Webhook in Slack, with a command such as `curl -X POST -d '{"text": "hello"}' ...`

## Configuration

APubHook works on [Cloudflare Workers](https://workers.cloudflare.com/); use [Wrangler](https://developers.cloudflare.com/workers/wrangler/) to configure and deploy.

```
cp wrangler.sample.toml wrangler.toml
pnpm install
```

Pick up the value of `PRIVATE_KEY` from the files generated by `ssh-keygen`:

```
ssh-keygen -b 4096 -m PKCS8 -t rsa -N '' -f id_rsa
```

You can also provide the value as a "Secret variable" in Cloudflare Workers.

## Setup Database

```
pnpm wrangler d1 create apubhook
pnpm wrangler d1 execute apubhook --file db.sql
```

## Deploy

```
pnpm run deploy
```

## Usage

APubHook can host multiple accounts on a single server. Accounts are managed in the `accounts` table on the D1 Database.

For example, the following command creates a BOT account named 'testbot'. `secretHookPath` is a secret string used in the WebHook URL.

```sh
pnpm wrangler d1 execute apubhook --command "insert or replace into accounts(username, displayName, secretHookPath, iconUrl, iconMime) VALUES('testbot', 'Test BOT', 'secret-path-example', '/static/icon.png', 'image/png')"
```

Then you can follow the account `@testbot@apubhook.<USERNAME>.workers.dev` on the Fediverse (alternatively, you could use a custom domain for Workers).

You can then post messages with an HTTP POST request carrying a JSON payload to `/hooks/secret-path-example`.

```sh
# Sending 'hello' to the Fediverse!
curl -X POST -d '{"text": "hello"}' "https://apubhook.<USERNAME>.workers.dev/hooks/secret-path-example"
```

You can see the post on the Fediverse!

![](./example.webp)

## License

MIT

This project uses a lot of code from Matchbox and is inspired by Minidon and Express ActivityPub Server.

Minidon https://github.com/yusukebe/minidon
Copyright (c) 2023 Yusuke Wada. Licensed under the MIT license.

Matchbox https://gitlab.com/acefed/matchbox
Copyright (c) 2022 Acefed MIT License

Express ActivityPub Server https://github.com/dariusk/express-activitypub
Copyright (c) 2018 Darius Kazemi. Licensed under the MIT license.
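If you prefer to script the webhook instead of using curl, a minimal Python sketch of the same request from the Usage section might look like this; the hostname and secret path are the example values from above and should be replaced with your own Workers domain and `secretHookPath`.

```python
# Minimal sketch: posting through an APubHook webhook from Python instead of curl.
# The hostname and secret path are the example values from the Usage section.
import requests

WEBHOOK_URL = "https://apubhook.<USERNAME>.workers.dev/hooks/secret-path-example"

resp = requests.post(WEBHOOK_URL, json={"text": "hello from Python"}, timeout=30)
resp.raise_for_status()
print("posted:", resp.status_code)
```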
Hugo-Dz/svelte-rust
https://github.com/Hugo-Dz/svelte-rust
A simple example to run Rust 🦀 code in your Svelte application.
# Demo Svelte + Rust

👉 You can play around with the live demo [here](https://svelte-rust.vercel.app/)

![demo](readme_assets/demo.gif)

# Create a Svelte app

Everything you need to build a Svelte project, powered by [`create-svelte`](https://github.com/sveltejs/kit/tree/master/packages/create-svelte).

## Creating a project

If you're seeing this, you've probably already done this step. Congrats!

```bash
# create a new project in the current directory
npm create svelte@latest

# create a new project in my-app
npm create svelte@latest my-app
```

## Developing

Once you've created a project and installed dependencies with `npm install` (or `pnpm install` or `yarn`), start a development server:

```bash
npm run dev

# or start the server and open the app in a new browser tab
npm run dev -- --open
```

## Building

To create a production version of your app:

```bash
npm run build
```

You can preview the production build with `npm run preview`.

> To deploy your app, you may need to install an [adapter](https://kit.svelte.dev/docs/adapters) for your target environment.

# Add some Rust 🦀 to your Svelte app

Install Rust by going to the [Install Rust](https://www.rust-lang.org/tools/install) page and following the instructions.

## Install wasm-pack

```bash
cargo install wasm-pack
```

## Create a new Rust package

```bash
cargo new --lib my-package
```

## Write a Rust function

```rust
// my-package/src/lib.rs
use wasm_bindgen::prelude::*;

#[wasm_bindgen]
pub fn add(a: i32, b: i32) -> i32 {
    a + b
}
```

## Specify the crate type of the package

Add `crate-type = ["cdylib"]` to the package's `Cargo.toml`:

```toml
[package]
name = "my-package"
version = "0.1.0"
edition = "2021"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
wasm-bindgen = "0.2.87"

[lib]
crate-type = ["cdylib"]
```

## Build your Rust package

```bash
cd my-package
wasm-pack build --target web
```

## Pick up the pkg folder generated in my-package

And drop it into the `lib` folder of your Svelte app (the directory the `$lib` alias points to).

## Import your Rust package in your Svelte app

```javascript
// src/routes/+page.svelte (inside the <script lang="ts"> block)
import { onMount } from "svelte";
import init, { add } from "$lib/pkg/my_package";

let numberA = 1;
let numberB = 2;
let result = 0;

// Initialize the WASM module once the component has mounted
onMount(async () => {
    await init();
});

function handleClick(): void {
    result = add(numberA, numberB);
}
```
HaxyMoly/Pangu-Weather-ReadyToGo
https://github.com/HaxyMoly/Pangu-Weather-ReadyToGo
盘古天气大模型全流程演示( 输入数据准备、预测及结果可视化)Unofficial demonstration of Huawei's Pangu Weather Model. Implementing the entire process of data preparation for input, forecasting, conversion of forecasted results, and visualization.
# Pangu-Weather-ReadyToGo

Unofficial demonstration of [Huawei's Pangu Weather Model](https://github.com/198808xc/Pangu-Weather). Implementing the entire process of data preparation for input, forecasting, conversion of forecasted results, and visualization.

【非官方】华为盘古天气模型演示,含输入数据准备、预测结果转换及结果可视化全流程。[中文指南](#安装和准备工作)

![T2M 24h forecast](https://github.com/HaxyMoly/Pangu-Weather-ReadyToGo/raw/main/img/T2M_24h.png)

## Installation and Preparation

1. Register for an account at [Climate Data Store](https://cds.climate.copernicus.eu/user/register)
2. Copy the url and key displayed on [CDS API key](https://cds.climate.copernicus.eu/api-how-to) and add them to the `~/.cdsapirc` file.
3. Clone this repo and install dependencies accordingly, depending on GPU availability.
```bash
git clone https://github.com/HaxyMoly/Pangu-Weather-ReadyToGo.git
cd Pangu-Weather-ReadyToGo
# GPU
pip install -r requirements_gpu.txt
# CPU
pip install -r requirements_cpu.txt
conda install -c conda-forge cartopy
```
4. Download the four pre-trained weights from [Pangu-Weather](https://github.com/198808xc/Pangu-Weather/tree/main#global-weather-forecasting-inference-using-the-trained-models) and create a folder named `models` to put them in. Feel free to download only one of them for testing purposes.
```bash
mkdir models
```

## Forecasting

1. Modify the `date_time` of the initial field in `data_prepare.py`.
```python
date_time = datetime(
    year=2023,
    month=7,
    day=9,
    hour=23,
    minute=0)
```
> You may check the data availability at a specific moment by using `test_avaliable_time.py`. You would get something like this:
> `The latest date available for this dataset is: 2023-07-13 13:00.`

2. Run `data_prepare.py` to download the initial field data and convert them to numpy arrays.
```bash
python data_prepare.py
```
3. Modify the following variables in `inference.py` according to your needs:
```python
# Enable GPU acceleration
use_GPU = True
# The date and time of the initial field
date_time = datetime(
    year=2023,
    month=7,
    day=9,
    hour=23,
    minute=0)
# The date and time of the final approaches
date_time_final = datetime(
    year=2023,
    month=7,
    day=17,
    hour=23,
    minute=0)
# The program automatically chooses the model combination that needs the fewest iterations to reach the final time
```
4. Execute `inference.py` to make a forecast
```bash
python inference.py
```
5. Modify the initial field `date_time` and the final `date_time_final` in `forecast_decode.py`
```python
# The date and time of the initial field
date_time = datetime(
    year=2023,
    month=7,
    day=9,
    hour=23,
    minute=0)
# The date and time of the final approaches
date_time_final = datetime(
    year=2023,
    month=7,
    day=17,
    hour=23,
    minute=0)
```
6. After making the forecast, run `forecast_decode.py` to convert the numpy arrays back to NetCDF format
```bash
python forecast_decode.py
```
7. Navigate to the forecasting directory to visualize the results
```bash
cd outputs/2023-07-09-23-00to2023-07-17-23-00
# Visualize the land surface forecast
ncvue output_surface.nc
# Visualize the upper air forecast
ncvue output_upper.nc
```
Don't forget to select the variable to be visualized.

![ncvue demo](https://github.com/HaxyMoly/Pangu-Weather-ReadyToGo/raw/main/img/ncvue_demo.png)

## Acknowledgement

Thanks to the Huawei team for their amazing meteorological forecasting model [Pangu-Weather](https://github.com/198808xc/Pangu-Weather). Thanks to mcuntz for the wonderful open-source NetCDF visualization project [ncvue](https://github.com/mcuntz/ncvue).

## Warning

I am a Bioinformatics student, not a meteorologist, so I cannot guarantee the accuracy of the code.
Therefore, this project is only intended for reference and learning purposes. Additionally, this project is based on [Pangu-Weather](https://github.com/198808xc/Pangu-Weather/tree/main#global-weather-forecasting-inference-using-the-trained-models) and follows its BY-NC-SA 4.0 license, and should not be used for commercial purposes. Please cite the publication of Pangu-Weather. ``` @Article{Bi2023, author={Bi, Kaifeng and Xie, Lingxi and Zhang, Hengheng and Chen, Xin and Gu, Xiaotao and Tian, Qi}, title={Accurate medium-range global weather forecasting with 3D neural networks}, journal={Nature}, doi={10.1038/s41586-023-06185-3}, } ``` ## 安装和准备工作 1. 前往 [Climate Data Store](https://cds.climate.copernicus.eu/user/register) 注册一个账号 2. 前往 [CDS API key](https://cds.climate.copernicus.eu/api-how-to),复制url和key,写入 `~/.cdsapirc` 文件 5. 克隆本仓库,根据是否有独显选择安装依赖 ```bash git clone https://github.com/HaxyMoly/Pangu-Weather-ReadyToGo.git cd Pangu-Weather-ReadyToGo # GPU pip install -r requirements_gpu.txt # CPU pip install -r requirements_cpu.txt conda install -c conda-forge cartopy ``` 4. 在 [Pangu-Weather](https://github.com/198808xc/Pangu-Weather/tree/main#global-weather-forecasting-inference-using-the-trained-models) 下载4个预训练模型,创建一个名为 `models` 的文件夹,把它们放进去(也可以根据需要任意下载一个测试) ```bash mkdir models ``` ## 预测Demo 1. 修改 `data_prepare.py` 中初始场的 `date_time`, ```python date_time = datetime( year=2023, month=7, day=9, hour=23, minute=0) ``` > 某时刻数据可用性可运行 `test_avaliable_time.py` 来查询.你的输出会是这样的: > `The latest date available for this dataset is: 2023-07-13 13:00.` 2. 执行 `data_prepare.py` 下载初始场数据并转换为npy格式 ```bash python data_prepare.py ``` 3. 根据需要修改 `inference.py` 中以下变量 ```python # 是否启用GPU加速 use_GPU = True # 初始场时刻 date_time = datetime( year=2023, month=7, day=9, hour=23, minute=0) # 目标到达时刻 date_time_final = datetime( year=2023, month=7, day=17, hour=23, minute=0) # 程式会自动选择最少到达目标时间的模型组合 ``` 4. 执行 `inference.py` 进行预测 ```bash python inference.py ``` 5. 修改 `forecast_decode.py` 中初始场时刻和目标到达时刻 ```python # 初始场时刻 date_time = datetime( year=2023, month=7, day=9, hour=23, minute=0) # 目标到达时刻 date_time_final = datetime( year=2023, month=7, day=17, hour=23, minute=0) ``` 6. 预测完成后,执行 `forecast_decode.py` 将npy转换回NetCDF格式 ```bash python forecast_decode.py ``` 7. 进入预测文件路径可视化结果 ```bash cd outputs/2023-07-09-23-00to2023-07-17-23-00 # 可视化预测地表数据 ncvue output_surface.nc # 或可视化预测大气数据 ncvue output_upper.nc ``` 记得选择要可视化的变量 ![cvue demo](https://github.com/HaxyMoly/Pangu-Weather-ReadyToGo/raw/main/img/ncvue_demo.png) ## 感谢 华为团队开源的气象预测大模型 [Pangu-Weather](https://github.com/198808xc/Pangu-Weather) mcuntz开源的优秀NetCDF可视化项目 [ncvue](https://github.com/mcuntz/ncvue) ## 警告 本人专业为生物信息学,并非气象专业人士,无法保证代码完全准确,因此该项目仅供参考交流学习。另该项目系基于 [Pangu-Weather](https://github.com/198808xc/Pangu-Weather),因此亦遵循原项目的BY-NC-SA 4.0开源许可证,切勿用于商业目的。使用本项目请引用原项目 ``` @Article{Bi2023, author={Bi, Kaifeng and Xie, Lingxi and Zhang, Hengheng and Chen, Xin and Gu, Xiaotao and Tian, Qi}, title={Accurate medium-range global weather forecasting with 3D neural networks}, journal={Nature}, doi={10.1038/s41586-023-06185-3}, } ```
jasonmccreary/what-the-fake
https://github.com/jasonmccreary/what-the-fake
Example app for my Laracon 2023 talk.
# What the fake?

This repo contains a Laravel 10 application used to demonstrate testing with fakes during my Laracon US 2023 talk.

You may review the [individual commits](https://github.com/jasonmccreary/what-the-fake/commits/main) to isolate some of the changes, or, of course, clone this repo to run the tests locally.

Don't hesitate to [reach out on Twitter](https://twitter.com/gonedark) with any feedback or questions.
XD2Sketch/next-google-tag-manager
https://github.com/XD2Sketch/next-google-tag-manager
Easily implement Google Tag Manager in your Next 13+ project
# @magicul/next-google-tag-manager ![npm bundle size](https://img.shields.io/bundlephobia/min/@magicul/next-google-tag-manager) ![npm](https://img.shields.io/npm/dt/next-google-tag-manager) ![GitHub issues](https://img.shields.io/github/issues/XD2Sketch/next-google-tag-manager) ![npm](https://img.shields.io/npm/v/@magicul/next-google-tag-manager) ![GitHub Repo stars](https://img.shields.io/github/stars/XD2Sketch/next-google-tag-manager?style=social) A lightweight Next 13 library to easily implement Google Tag Manager in your projects. This package forwards all of the page changes to Google Tag Manager so you can easily track your users. ## Installation Install this package with `npm` ```bash npm i @magicul/next-google-tag-manager ``` Or with `yarn` ```bash yarn add @magicul/next-google-tag-manager ``` ## Usage ### Using the GoogleTagManager component To initialize Google Tag Manager, add `<GoogleTagManager />` to `app/layout.tsx` like this: ```tsx import GoogleTagManager from '@magicul/next-google-tag-manager'; const RootLayout = ({ children }) => ( <html lang="en"> <body> <GoogleTagManager id="GTM-XXXXX" /> {children} </body> </html> ); ``` Note: This package utilizes [next/script](https://nextjs.org/docs/basic-features/script), which means you **can't** place it inside a `next/head`. ### Configuring To customize the way you load Google Tag Manager, you can pass the following props to the component: | Prop name | Type | Default value | Description | |-------------|--------|--------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | id | string | - | The ID of your Google Tag Manager container | | server | string | www.googletagmanager.com | The tagging server that is used, you can configure your own server here or use the default Google Tag Manager server by default. This is used for server side tagging. Please only put the domain, subdomain and top level domain here to make it work correctly. | | auth | string | - | Authentication string for the container configuration. | | environment | string | - | The environment that is used, see [Environments - Tag Manager Help](https://support.google.com/tagmanager/answer/6311518?hl=en) for more information |
foundry-rs/starknet-foundry
https://github.com/foundry-rs/starknet-foundry
Blazing fast toolkit for developing Starknet contracts.
<img src="./docs/src/images/logo.png" alt="logo" width="120" align="right" /> ## Starknet Foundry [![Telegram Chat][tg-badge]][tg-url] [![Telegram Support][tg-support-badge]][tg-support-url] [tg-badge]: https://img.shields.io/endpoint?color=neon&logo=telegram&label=chat&style=flat-square&url=https%3A%2F%2Ftg.sumanjay.workers.dev%2Fstarknet_foundry [tg-url]: https://t.me/starknet_foundry [tg-support-badge]: https://img.shields.io/endpoint?color=neon&logo=telegram&label=support&style=flat-square&url=https%3A%2F%2Ftg.sumanjay.workers.dev%2Fstarknet_foundry_support [tg-support-url]: https://t.me/starknet_foundry_support Blazingly fast toolkit for developing Starknet contracts designed & developed by ex [Protostar](https://github.com/software-mansion/protostar) team from [Software Mansion](https://swmansion.com) based on native [Cairo](https://github.com/starkware-libs/cairo) test runner and [Blockifier](https://github.com/starkware-libs/blockifier), written in Rust 🦀. Need help getting started with Starknet Foundry? Read the 📖 [Starknet Foundry Book](https://foundry-rs.github.io/starknet-foundry/)! ![Example run](.github/images/demo-gif/demo.gif) Starknet Foundry, like its [Ethereum counterpart](https://github.com/foundry-rs/foundry), consists of different modules - [Forge](https://github.com/foundry-rs/starknet-foundry/tree/master/crates/forge): Starknet testing framework (like Truffle, Hardhat and DappTools but for Starknet). - [Cast](https://github.com/foundry-rs/starknet-foundry/tree/master/crates/cast): All-in-one tool for interacting with Starknet smart contracts, sending transactions and getting chain data. ## Installation To install Starknet Foundry, run: ```shell curl -L https://raw.githubusercontent.com/foundry-rs/starknet-foundry/master/scripts/install.sh | sh ``` You can also specify a version you wish to install: ```shell curl -L https://raw.githubusercontent.com/foundry-rs/starknet-foundry/master/scripts/install.sh | sh -s -- -v 0.1.0 ``` To verify that the Starknet Foundry is installed correctly, run `snforge --version` and `sncast --version`. ## Features - Fast testing framework `Forge` written in Rust - High-quality dependency management using [scarb](https://github.com/software-mansion/scarb) - Intuitive interactions and deployment of Starknet contracts through `Cast` ## Roadmap Starknet Foundry is under active development! Expect a lot of new features to appear soon! 🔥 - [x] Running tests written in Cairo - [x] Contract interactions testing - [x] Interacting with Starknet from command line - [x] Multicall support - [ ] Cheatcodes - [ ] Parallel tests execution - [ ] Performance improvements - [ ] Deployment scripts written in Cairo - [ ] Starknet state forking - [ ] Advanced debugging utilities - [ ] L1 ↔ L2 messaging and cross-chain testing - [ ] Transactions profiling - [ ] Fuzz testing - [ ] Test coverage reports ## Performance Forge achieves performance comparable to the Cairo Test Runner with improved user experience. All that is possible on just a single thread and multithreading is well on its way! ![Starknet test framework speed comparison](./benchmarks/plot.png) To learn more about our benchmark methodology check [here](./benchmarks/). ## Getting Help You haven't found your answer to your question in the [Starknet Foundry Book](https://foundry-rs.github.io/starknet-foundry/)? 
- Join the [Telegram](https://t.me/starknet_foundry_support) group to get help - Open a [GitHub discussion](https://github.com/foundry-rs/starknet-foundry/discussions) with your question - Join the [Starknet Discord](https://discord.com/invite/qypnmzkhbc) Found a bug? Open an [issue](https://github.com/foundry-rs/starknet-foundry/issues). ## Contributions Starknet Foundry is under active development, and we appreciate any help from the community! Want to contribute? Read the [contribution guidelines](./CONTRIBUTING.md). Check out [development guide](https://foundry-rs.github.io/starknet-foundry/development/environment-setup.html) for local environment setup guide.
milandas63/GIFT-Group2
https://github.com/milandas63/GIFT-Group2
null
# GIFT-Group2

| NO | NAME | EMAIL-ID | MOBILE |
|----|------|----------|--------|
| 1 | Abinash Das | [email protected] | 9238959872 |
| 2 | Ajaya Mandal | [email protected] | 7815096934 |
| 3 | Asish Kumar Sahoo | [email protected] | 6371211827 |
| 4 | Babul Parida | [email protected] | 6372745808 |
| 5 | Brahmashree Swain | [email protected] | 9937409217 |
| 6 | Gourab Kumar Sahoo | [email protected] | 9938348522 |
| 7 | Guru Prasad Sahoo | [email protected] | 8249355147 |
| 8 | Kanhu Charan Rout | [email protected] | 6370680211 |
| 9 | Kshirendra Malik | [email protected] | 7846905856 |
| 10 | Kshitish Nanda | [email protected] | 8018081644 |
| 11 | Maheswar Malik | [email protected] | 8984081985 |
| 12 | Millan Kumar Patra | [email protected] | 7848044263 |
| 13 | Omm Biswajit Mohanty | [email protected] | 8117830404 |
| 14 | Papu Sahoo | [email protected] | 9437253798 |
| 15 | Prakash Sahoo | [email protected] | 9668187814 |
| 16 | Rahul Bastia | [email protected] | 6371480952 |
| 17 | Rashmi Ranjan Sethi | [email protected] | 9348353588 |
| 18 | Ritesh Baral | [email protected] | 7815084984 |
| 19 | Satyajeet Biswal | [email protected] | 7848960431 |
| 20 | Satyajit Nayak | [email protected] | 7381965865 |
| 21 | Sk Jabir Uddin | [email protected] | 7205078551 |
| 22 | Somanath Pradhan | [email protected] | 8018464867 |
| 23 | Sonu Swain | [email protected] | 7846804370 |
| 24 | Soumya Ranjan Bidhar | [email protected] | 7848098290 |
| 25 | Soumyaranjan Das | [email protected] | 7205710882 |
| 26 | Subhankar Kundu | [email protected] | 9124237129 |
| 27 | Subhasish Mandal | [email protected] | 7205745281 |
| 28 | Subhra Prakash Dhal | [email protected] | 9827765986 |
| 29 | Suman Kumar Jena | [email protected] | 7205077241 |
| 30 | Suvam Bhatta | [email protected] | 8260549704 |
| 31 | Sukhendu Dutta | [email protected] | 8986775695 |
| 32 | Anjan Kumar Nanda | [email protected] | 8260988942 |
| 33 | Saha Sahil | [email protected] | 9178695126 |
taprosoft/llm_finetuning
https://github.com/taprosoft/llm_finetuning
Convenient wrapper for fine-tuning and inference of Large Language Models (LLMs) with several quantization techniques (GPTQ, bitsandbytes)
![test workflow](https://github.com/taprosoft/llm_finetuning/actions/workflows/tests.yml/badge.svg)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![pre-commit](https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit\&logoColor=white)](https://github.com/pre-commit/pre-commit)

# Memory Efficient Fine-tuning of Large Language Models (LoRA + quantization)

This repository contains a convenient wrapper for fine-tuning and inference of Large Language Models (LLMs) in memory-constrained environments.

Two major components that democratize the training of LLMs are: Parameter-Efficient Fine-tuning ([PEFT](https://github.com/huggingface/peft)) (e.g. LoRA, Adapter) and quantization techniques (8-bit, 4-bit). However, there exist many quantization techniques and corresponding implementations, which makes it hard to compare and test different training configurations effectively.

This repo aims to provide a common fine-tuning pipeline for LLMs to help researchers quickly try the most common quantization methods and create a compute-optimized training pipeline.

This repo is built upon these materials:

* [alpaca-lora](https://github.com/tloen/alpaca-lora) for the original training script.
* [GPTQ-for-LLaMa](https://github.com/qwopqwop200/GPTQ-for-LLaMa) for the efficient GPTQ quantization method.
* [exllama](https://github.com/turboderp/exllama) for the high-performance inference engine.

## Key Features

* Memory-efficient fine-tuning of LLMs on consumer GPUs (<16GiB) by utilizing LoRA (Low-Rank Adapter) and quantization techniques.
* Support for the most popular quantization techniques: 8-bit, 4-bit quantization from [bitsandbytes](https://github.com/TimDettmers/bitsandbytes) and [GPTQ](https://github.com/qwopqwop200/GPTQ-for-LLaMa).
* Correct PEFT checkpoint saving at regular intervals to minimize the risk of progress loss during long training.
* Correct checkpoint resume for all quantization methods.
* Support distributed training on multiple GPUs (with examples).
* Support gradient checkpointing for both `GPTQ` and `bitsandbytes`.
* Switchable prompt templates to fit different pretrained LLMs.
* Support evaluation loop to ensure LoRA is correctly loaded after training.
* Inference and deployment examples.
* Fast inference with [exllama](https://github.com/turboderp/exllama) for GPTQ models.

## Usage

### Demo notebook

See [notebook](llm_finetuning.ipynb) or on Colab [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/taprosoft/llm_finetuning/blob/main/llm_finetuning.ipynb).

### Setup

1. Install default dependencies

```bash
pip install -r requirements.txt
```

2. If `bitsandbytes` doesn't work, [install it from source.](https://github.com/TimDettmers/bitsandbytes/blob/main/compile_from_source.md) Windows users can follow [these instructions](https://github.com/tloen/alpaca-lora/issues/17)

3. To use the efficient 4-bit CUDA kernels from ExLlama and GPTQ for training and inference

```bash
pip install -r cuda_quant_requirements.txt
```

Note that installing the above packages requires a CUDA installation to compile the custom kernels. If you have issues, look for help in the original repos [GPTQ](https://github.com/qwopqwop200/GPTQ-for-LLaMa), [exllama](https://github.com/turboderp/exllama) for advice.

### Data Preparation

Prepare the instruction data to fine-tune the model in the following JSON format.
```json
[
  {
    "instruction": "do something with the input",
    "input": "input string",
    "output": "output string"
  }
]
```

You can supply a single JSON file as training data and perform auto split for validation. Or, prepare two separate `train.json` and `test.json` in the same directory to supply as train and validation data.

You should also take a look at [templates](templates/README.md) to see the different prompt templates used to combine the instruction, input, output pair into a single text. During the training process, the model is trained using the CausalLM objective (text completion) on the combined text. The prompt template must be compatible with the base LLM to maximize performance. Read the details of the model card on HF ([example](https://huggingface.co/WizardLM/WizardLM-30B-V1.0)) to get this information. The prompt template can be switched via command line parameters at the training and inference steps.

We also support raw text file input and ShareGPT conversation-style input. See [templates](templates/README.md).

### Training (`finetune.py`)

This file contains a straightforward application of PEFT to the LLaMA model, as well as some code related to prompt construction and tokenization. We use the common HF trainer to ensure compatibility with other libraries such as [accelerate](https://github.com/huggingface/accelerate).

Simple usage:

```bash
bash scripts/train.sh

# OR

python finetune.py \
    --base_model 'decapoda-research/llama-7b-hf' \
    --data_path 'yahma/alpaca-cleaned' \
    --output_dir './lora-output'
```

where `data_path` is the path to a JSON file or a directory containing `train.json` and `test.json`. `base_model` is the model name on the HF model hub or the path to a local model on disk.

We can also tweak other hyperparameters (see example in [train.sh](scripts/train.sh)):

```bash
python finetune.py \
    --base_model 'decapoda-research/llama-7b-hf' \
    --data_path 'yahma/alpaca-cleaned' \
    --output_dir './lora-output' \
    --mode 4 \
    --batch_size 128 \
    --micro_batch_size 4 \
    --num_epochs 3 \
    --learning_rate 1e-4 \
    --cutoff_len 512 \
    --val_set_size 0.2 \
    --lora_r 8 \
    --lora_alpha 16 \
    --lora_dropout 0.05 \
    --lora_target_modules '[q_proj,v_proj]' \
    --resume_from_checkpoint checkpoint-29/adapter_model/
```

Some notable parameters:

```
micro_batch_size: size of the batch on each GPU, greatly affects VRAM usage
batch_size: actual batch size after gradient accumulation
cutoff_len: maximum length of the input sequence, greatly affects VRAM usage
gradient_checkpointing: use gradient checkpointing to save memory, however training speed will be lower
mode: quantization mode to use, acceptable values [4, 8, 16 or "gptq"]
resume_from_checkpoint: resume training from an existing LoRA checkpoint
```

#### Download model from HF hub (`download.py`)

You can use the helper script `python download_model.py <model_name>` to download a model from the HF model hub and store it locally. By default it will save the model to the `models` folder in the current path. Make sure to create this folder or change the output location with `--output`.

#### Quantization mode selection

For the effects of the quantization mode on training time and memory usage, see this [note](benchmark/README.md). Generally, the `16` and `gptq` modes have the best performance and should be selected to reduce training time. However, most of the time you will hit the memory limitation of the GPU with larger models, in which case modes `4` and `gptq` provide the best memory saving effect. Overall, `gptq` mode has the best balance between memory saving and training speed.
**NOTE**: To use `gptq` mode, you must install the required packages in `cuda_quant_requirements`. Also, since GPTQ is a post-hoc quantization technique, only GPTQ-quantized models can be used for training. Look for model names which contain `gptq` on the HF model hub, such as [TheBloke/orca_mini_v2_7B-GPTQ](https://huggingface.co/TheBloke/orca_mini_v2_7B-GPTQ). To correctly load the checkpoint, GPTQ models require an offline checkpoint download as described in the previous section.

If you don't use `wandb` and want to disable the prompt at the start of every training run, run `wandb disabled`.

### Training on multiple GPUs

By default, in a multi-GPU environment, the training script will load the model weights and split the layers across the different GPUs. This is done to reduce VRAM usage, which allows loading larger models than a single GPU can handle. However, this essentially wastes the power of multiple GPUs since the computation only runs on 1 GPU at a time, thus training time is mostly similar to a single GPU run.

To correctly run the training on multiple GPUs in parallel, you can use `torchrun` or `accelerate` to launch distributed training. Check the examples in [train_torchrun.sh](scripts/train_torchrun.sh) and [train_accelerate.sh](scripts/train_accelerate.sh). Training time will be drastically lower. However, you should modify `batch_size` to be divisible by the number of GPUs.

```bash
bash scripts/train_torchrun.sh
```

### Evaluation

Simply add `--eval` and `--resume_from_checkpoint` to perform evaluation on the validation data.

```bash
python finetune.py \
    --base_model 'decapoda-research/llama-7b-hf' \
    --data_path 'yahma/alpaca-cleaned' \
    --resume_from_checkpoint output/checkpoint-29/adapter_model/ \
    --eval
```

### Inference (`inference.py`)

This file loads the fine-tuned LoRA checkpoint with the base model and performs inference on the selected dataset. Output is printed to the terminal and stored in `sample_output.txt`. Example usage:

```bash
python inference.py \
    --base models/TheBloke_vicuna-13b-v1.3.0-GPTQ/ \
    --delta lora-output \
    --mode exllama \
    --type local \
    --data data/test.json
```

Important parameters:

```
base: model id or path to base model
delta: path to fine-tuned LoRA checkpoint (optional)
data: path to evaluation dataset
mode: quantization mode to load the model, acceptable values [4, 8, 16, "gptq", "exllama"]
type: inference type to use, acceptable values ["local", "api", "guidance"]
```

Note that the `gptq` and `exllama` modes are only compatible with GPTQ models. `exllama` currently provides the best inference speed and is thus recommended. Inference type `local` is the default option (use local model loading). To use inference type `api`, we need an instance of the `text-generation-inference` server described in [deployment](deployment/README.md). Inference type `guidance` is an advanced method to ensure the structure of the text output (such as JSON). Check the command line `inference.py --help` and [guidance](https://github.com/microsoft/guidance) for more information.

### Checkpoint export (`merge_lora_checkpoint.py`)

This file contains scripts that merge the LoRA weights back into the base model for export to Hugging Face format. They should help users who want to run inference in projects like [llama.cpp](https://github.com/ggerganov/llama.cpp) or [text-generation-inference](https://github.com/huggingface/text-generation-inference).

Currently, we do not support merging LoRA into a GPTQ base model due to incompatibility issues with the quantized weights.
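For readers who want to see what the merge step amounts to in code, here is a minimal, hedged sketch using the `peft` and `transformers` APIs. The model and adapter paths are placeholders, and this is an illustration of the general approach rather than the exact contents of `merge_lora_checkpoint.py`:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Placeholder paths: substitute your own base model and LoRA output directory.
BASE_MODEL = "decapoda-research/llama-7b-hf"
LORA_DIR = "./lora-output"

# Load the (unquantized) base model, attach the LoRA adapter, then fold the
# low-rank updates back into the base weights.
base = AutoModelForCausalLM.from_pretrained(BASE_MODEL, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base, LORA_DIR)
merged = model.merge_and_unload()

# Save the merged model (and tokenizer) in standard Hugging Face format.
merged.save_pretrained("./merged-model")
AutoTokenizer.from_pretrained(BASE_MODEL).save_pretrained("./merged-model")
```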
### Deployment

See [deployment](deployment/README.md).

### Quantization with GPTQ

To convert a normal HF checkpoint to a GPTQ checkpoint, we need a conversion script. See [GPTQ-for-LLaMa](https://github.com/qwopqwop200/GPTQ-for-LLaMa) and [AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ) for more information.

### Benchmarking

This [document](benchmark/README.md) provides a comprehensive summary of different quantization methods and some suggestions for efficient training & inference.

### Recommended models

Recommended models to start:

* 7B: [TheBloke/vicuna-7B-v1.3-GPTQ](https://huggingface.co/TheBloke/vicuna-7B-v1.3-GPTQ), [lmsys/vicuna-7b-v1.3](https://huggingface.co/lmsys/vicuna-7b-v1.3)
* 13B: [TheBloke/vicuna-13b-v1.3.0-GPTQ](https://huggingface.co/TheBloke/vicuna-13b-v1.3.0-GPTQ), [lmsys/vicuna-13b-v1.3](https://huggingface.co/lmsys/vicuna-13b-v1.3)
* 33B: [TheBloke/airoboros-33B-gpt4-1.4-GPTQ](https://huggingface.co/TheBloke/airoboros-33B-gpt4-1.4-GPTQ)

### Resources

- https://github.com/ggerganov/llama.cpp: highly portable Llama inference based on C++
- https://github.com/huggingface/text-generation-inference: production-level LLM serving
- https://github.com/microsoft/guidance: enforce structure to LLM output
- https://github.com/turboderp/exllama/: high-performance GPTQ inference
- https://github.com/qwopqwop200/GPTQ-for-LLaMa: GPTQ quantization
- https://github.com/oobabooga/text-generation-webui: a flexible Web UI with support for multiple LLM back-ends
- https://github.com/vllm-project/vllm/: high throughput LLM serving

### Acknowledgements

- @disarmyouwitha [exllama_fastapi](https://github.com/turboderp/exllama/issues/37#issuecomment-1579593517)
- @turboderp [exllama](https://github.com/turboderp/exllama)
- @johnsmith0031 [alpaca_lora_4bit](https://github.com/johnsmith0031/alpaca_lora_4bit)
- @TimDettmers [bitsandbytes](https://github.com/TimDettmers/bitsandbytes)
- @tloen [alpaca-lora](https://github.com/tloen/alpaca-lora/)
- @oobabooga [text-generation-webui](https://github.com/oobabooga/text-generation-webui)
DFIR-Community/ArtifactParsers
https://github.com/DFIR-Community/ArtifactParsers
A repo that aims to centralize a current, running list of relevant parsers/tools for known DFIR artifacts
# ArtifactParsers A repo that aims to centralize a current, running list of relevant parsers/tools for known DFIR artifacts. ## What makes this different from any other list of DFIR tools? Ideally, the community will maintain this as tools come and go from relevance. If a tool is listed below, the community is vouching for it that it still works and is an excellent option to solve whatever problem you may be facing with a particular artifact. ## Commercial Tool Disclaimer It's not that commercial tools aren't welcome in this list, but the table would become pretty bloated when you have 5+ tools duplicated in many cells. At the very minimum, this project aims to highlight single-purpose tools made by the DFIR community members to allow for greater visibility at the options (often at no cost) for those looking to solve problems in their everyday investigations. Much love for the commercial vendors, their efforts, and their contributions to the community, but it would be ideal for anyone looking to learn more about the capabilities of a commercial tool to reach out to the vendor themselves or visit their official website for more information. ## Analyzers vs. Parsers In the instance of Windows Event Logs, the Windows Registry, and possibly other artifacts, there is a distinct difference between a tool that analyzes an artifact and parses the artifact. Generally speaking, an analysis tool would do something similar to running YARA or SIGMA rules against a set of artifacts and provide meaningful output based on the rulesets used. A parser would provide raw output without any predetermined rulesets or logic applied to the set of artifacts, leaving the analysis and interpretation to the end examiner. This is an important distinction to make with this project because, in the example of Windows Event Logs, it would be troublesome to lead an examiner looking for a tool to parse Windows Event Logs to think that a tool like Chainsaw, Hayabusa, or Zircolite will parse event logs when in reality they analyze the event logs using rulesets and logic created by threat researchers. Those tools do not PARSE the event logs like EvtxECmd, etc. ## Contributing Please contribute to this list if any artifacts and their corresponding tools still need to be included! 
## Windows | DFIR Artifact | CLI Tool(s) | GUI Tool(s) | |--------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | $I30 | [go-ntfs](https://github.com/Velocidex/go-ntfs)<br>[Index2Csv](https://github.com/jschicht/Indx2Csv)<br>[IndexCarver](https://github.com/jschicht/IndxCarver)<br>[MFTECmd](https://ericzimmerman.github.io/#!index.md) | | | $J | [dfir_ntfs](https://github.com/msuhanov/dfir_ntfs)<br>[ExtractUsnJrnl](https://github.com/jschicht/ExtractUsnJrnl)<br>[go-ntfs](https://github.com/Velocidex/go-ntfs)<br>[MFTECmd](https://ericzimmerman.github.io/#!index.md) | [NTFS Log Tracker](https://sites.google.com/site/forensicnote/ntfs-log-tracker) | | $LogFile | [dfir_ntfs](https://github.com/msuhanov/dfir_ntfs)<br>[go-ntfs](https://github.com/Velocidex/go-ntfs)<br>[LogFileParser](https://github.com/jschicht/LogFileParser)<br>[RcrdCarver](https://github.com/jschicht/RcrdCarver) | [NTFS Log Tracker](https://sites.google.com/site/forensicnote/ntfs-log-tracker) | | $MFT | [dfir_ntfs](https://github.com/msuhanov/dfir_ntfs)<br>[Mft2Csv](https://github.com/jschicht/Mft2Csv)<br>[MftCarver](https://github.com/jschicht/MftCarver)<br>[MFTECmd](https://ericzimmerman.github.io/#!index.md)<br>[MftRcrd](https://github.com/jschicht/MftRcrd) | [MFT_Browser](https://github.com/kacos2000/MFT_Browser)<br>[MFTExplorer](https://ericzimmerman.github.io/#!index.md)<br>[NTFS Log Tracker](https://sites.google.com/site/forensicnote/ntfs-log-tracker) | | $SDS | [MFTECmd](https://ericzimmerman.github.io/#!index.md)<br>[Secure2Csv](https://github.com/jschicht/Secure2Csv) | | | Amcache | [AmcacheParser](https://ericzimmerman.github.io/#!index.md) | [Registry Explorer](https://ericzimmerman.github.io/#!index.md) | | AppCompatCache (ShimCache) | [AppCompatCacheParser](https://ericzimmerman.github.io/#!index.md) | [Registry Explorer](https://ericzimmerman.github.io/#!index.md) | | AppCompatCache PCA (Windows 11 only) | [PCAParser](https://github.com/AndrewRathbun/PCAParser) | | | Browsing History | [BrowsingHistoryView](https://www.nirsoft.net/utils/browsing_history_view.html)<br>[Hindsight](https://github.com/obsidianforensics/hindsight) - Chromium only<br>[SQLECmd](https://ericzimmerman.github.io/#!index.md) - SQLite only | [BrowsingHistoryView](https://www.nirsoft.net/utils/browsing_history_view.html)<br>[Browser History Viewer](https://www.foxtonforensics.com/browser-history-viewer/) | | CSV Files | | [Modern CSV](https://www.moderncsv.com/)<br>[Timeline Explorer](https://ericzimmerman.github.io/#!index.md) | | Email (MBOX) | [mbox-web-viewer](https://github.com/PHMRanger/mbox-web-viewer) | [mboxviewer](https://github.com/eneam/mboxviewer) | | Email (OST/PST) | [XstExporter](https://github.com/iluvadev/XstReader#xstexporter) | [XstReader](https://github.com/iluvadev/XstReader) | | ESE Databases (General) | [WindowsEDB-to-CSV](https://github.com/kacos2000/WinEDB) | 
[ESEDatabaseView](https://www.nirsoft.net/utils/ese_database_view.html)<br>[WinEDB](https://github.com/kacos2000/WinEDB) | | ETL Files | [ETLParser](https://github.com/forensiclunch/ETLParser) | | | Event Logs (.evtx) - Analyzers | [Chainsaw](https://github.com/WithSecureLabs/chainsaw)<br>[EvtxHussar](https://github.com/yarox24/EvtxHussar)<br>[Hayabusa](https://github.com/Yamato-Security/hayabusa)<br>[Zircolite](https://github.com/wagga40/Zircolite) | | | Event Logs (.evtx) - Parsers | [Events-Ripper](https://github.com/keydet89/Events-Ripper)<br>[EvtxECmd](https://ericzimmerman.github.io/#!index.md) | [Event Log Explorer](https://www.eventlogxp.com/)<br>[Event Log Observer](https://lizard-labs.com/event_log_observer.aspx)<br>[Evtx_Log_Browser](https://github.com/kacos2000/Evtx_Log_Browser)<br>[FullEventLogView](https://www.nirsoft.net/utils/full_event_log_view.html)<br>[LogViewPlus](https://www.logviewplus.com/log-viewer.html) | | Google Drive | [gMetaDataParse](https://github.com/00010111/gMetaDataParse) | [gMetaDataParse](https://github.com/00010111/gMetaDataParse) | | IIS Logs | [IISGeoLocate](https://ericzimmerman.github.io/#!index.md) | [LogViewPlus](https://www.logviewplus.com/log-viewer.html) | | Image Mounting | [Arsenal Image Mounter](https://arsenalrecon.com/downloads) | [Arsenal Image Mounter](https://arsenalrecon.com/downloads) | | IP Address GeoLocation | [Abeebus](https://github.com/13Cubed/Abeebus) | | | JumpLists | [JLECmd](https://ericzimmerman.github.io/#!index.md) | [Jumplist-Browser](https://github.com/kacos2000/Jumplist-Browser)<br>[JumpList Explorer](https://ericzimmerman.github.io/#!index.md) | | LevelDB | [LevelDBDumper](https://github.com/mdawsonuk/LevelDBDumper) | [LevelDB Recon](https://arsenalrecon.com/downloads) | | LNK Files | [LECmd](https://ericzimmerman.github.io/#!index.md) | [Jumplist-Browser](https://github.com/kacos2000/Jumplist-Browser) | | MalwareBytes Logs | [MBAMServiceLogParser.ps1](https://github.com/AndrewRathbun/DFIRPowerShellScripts/blob/main/MBAMServiceLogParser.ps1) | | | NetWire Logs | [NetWireLogDecoder](https://github.com/ArsenalRecon/NetWireLogDecoder) | | | OneDrive | [OneDrive .ODL Parser](https://github.com/ydkhatri/OneDrive)<br>[OneDriveExplorer](https://github.com/Beercow/OneDriveExplorer) | [OneDriveExplorer](https://github.com/Beercow/OneDriveExplorer) | | Prefetch | [PECmd](https://ericzimmerman.github.io/#!index.md) | [Prefetch-Browser](https://github.com/kacos2000/Prefetch-Browser)<br>[WinPrefetchView](https://www.nirsoft.net/utils/win_prefetch_view.html) | | RAM (Memory) | [Memory-Baseliner](https://github.com/csababarta/memory-baseliner)<br>[Volatility](https://www.volatilityfoundation.org/releases) | [MemProcFS](https://github.com/ufrisk/MemProcFS)<br>[Volatility Workbench](https://www.osforensics.com/tools/volatility-workbench.html) | | RDP Bitmap Cache | [BMC-Tools](https://github.com/dingtoffee/bmc-tools) | | | Recycle Bin | [RBCmd](https://ericzimmerman.github.io/#!index.md) | | | RecentFileCache | [RecentFileCacheParser](https://ericzimmerman.github.io/#!index.md) | | | Registry - Analyzers | [reg_hunter](https://github.com/theflakes/reg_hunter) | | | Registry - Comparison Tools | | [RegistryChangesView](https://www.nirsoft.net/utils/registry_changes_view.html)<br>[RegShot-Advanced](https://github.com/skydive241/Regshot-Advanced) | | Registry - Parsers | [jarp](https://github.com/ydkhatri/jarp)<br>[RECmd](https://ericzimmerman.github.io/#!index.md)<br>[Registry 
Recon](https://arsenalrecon.com/downloads)<br>[RegRipper](https://github.com/keydet89/RegRipper3.0)<br>[yarp](https://github.com/msuhanov/yarp) | [Registry Explorer](https://ericzimmerman.github.io/#!index.md) | | Shellbags | [SBECmd](https://ericzimmerman.github.io/#!index.md) | [Shellbags Explorer](https://ericzimmerman.github.io/#!index.md) | | Shim Databases | | [SDB Explorer](https://ericzimmerman.github.io/#!index.md) | | SQLite Databases | [SQLECmd](https://ericzimmerman.github.io/#!index.md) | [DB Browser for SQLite](https://sqlitebrowser.org/)<br>[Navicat for SQLite](https://navicat.com/en/products/navicat-for-sqlite)<br>[SQLiteStudio](https://sqlitestudio.pl/) | | SRUM Database (ESE) | [SrumECmd](https://ericzimmerman.github.io/#!index.md)<br>[srum-dump](https://github.com/MarkBaggett/srum-dump) | | | SUM Database (ESE) | [SumECmd](https://ericzimmerman.github.io/#!index.md) | | | Symantec AV Logs | [SEParser](https://github.com/Beercow/SEPparser) | [SEParser](https://github.com/Beercow/SEPparser) | | Thumbcache | [Thumbcache Viewer (CMD)](https://github.com/thumbcacheviewer/thumbcacheviewer/releases/tag/v1.0.2.0) | [Thumbcache Viewer](https://github.com/thumbcacheviewer/thumbcacheviewer) | | Volume Shadow Copies | [VSCMount](https://ericzimmerman.github.io/#!index.md) | [ShadowExplorer](https://shadowexplorer.com/) | | Windows Timeline | [WxTCmd](https://ericzimmerman.github.io/#!index.md)<br>[Windows Timeline PowerShell Scripts](https://github.com/kacos2000/WindowsTimeline) | [Clippy.exe](https://github.com/kacos2000/WindowsTimeline/releases)<br>[WindowsTimeline.exe](https://github.com/kacos2000/WindowsTimeline/releases) | | WBEM (WMI) | [flare-wmi](https://github.com/mandiant/flare-wmi)<br>[PyWMIPersistenceFinder](https://github.com/davidpany/WMI_Forensics)<br>[WMIParserStr](https://github.com/AndrewRathbun/WMIParserStr)<br>[WMI-Parser](https://github.com/AndrewRathbun/WMI-Parser) | [WMI-Explorer](https://github.com/AndrewRathbun/WMI-Explorer) | | Windows Defender Logs | [DHParser](https://github.com/jklepsercyber/defender-detectionhistory-parser) | | | Windows Search Index Database | [SIDR](https://github.com/strozfriedberg/sidr)<br>[WinEDB](https://github.com/kacos2000/WinEDB) | [WinSearchDBAnalyzer](https://github.com/AndrewRathbun/WinSearchDBAnalyzer)<br>[WinEDB](https://github.com/kacos2000/WinEDB) | ## Android | DFIR Artifact | CLI Tool(s) | GUI Tools(s) | |-------------------|---------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | Android Artifacts | [ALEAPP](https://github.com/abrignoni/ALEAPP)<br>[Andriller](https://github.com/den4uk/andriller) | [ALEAPP](https://github.com/abrignoni/ALEAPP)<br>[Andriller](https://github.com/den4uk/andriller)<br>[Avilla Forensics](https://github.com/AvillaDaniel/AvillaForensics) | | SQLite Databases | [SQLECmd](https://ericzimmerman.github.io/#!index.md) | [DB Browser for SQLite](https://sqlitebrowser.org/) | ## iOS | DFIR Artifact | CLI Tool(s) | GUI Tools(s) | |------------------|-------------------------------------------------------|---------------------------------------------------------------| | iOS Artifacts | [iLEAPP](https://github.com/abrignoni/iLEAPP) | [iLEAPP](https://github.com/abrignoni/iLEAPP) | | PList Files | | [plist Editor Pro](https://www.icopybot.com/plist-editor.htm) | | SQLite Databases | 
[SQLECmd](https://ericzimmerman.github.io/#!index.md) | [DB Browser for SQLite](https://sqlitebrowser.org/) | ## macOS | DFIR Artifact | CLI Tool(s) | GUI Tools(s) | |-----------------|------------------------------------------------|--------------| | macOS Artifacts | [mac_apt](https://github.com/ydkhatri/mac_apt) | |
NavidShokoufeh/v2ray
https://github.com/NavidShokoufeh/v2ray
v2ray client package for flutter
# v2ray client package for flutter

This package will include code for both Android and iOS.

# Features

Easy setup, fast connect, optimized code for Android (Java) and iOS (Swift).

# Please share your interest in this package so it can be published sooner

# Note

This package is under development and will be published as soon as possible :)
weaviate/healthsearch-demo
https://github.com/weaviate/healthsearch-demo
Discover Healthsearch: Unlocking Health with Semantic Search ✨
# Healthsearch Welcome to the Healthsearch Demo, an open-source project aimed at showcasing the potential of leveraging user-written reviews and queries to retrieve supplement products based on specific health effects. [![Weaviate](https://img.shields.io/static/v1?label=%E2%9D%A4%20made%20with&message=Weaviate&color=green&style=flat-square)](https://weaviate.io/) [![Docker support](https://img.shields.io/badge/Docker_support-%E2%9C%93-4c1?style=flat-square&logo=docker&logoColor=white)](https://docs.docker.com/get-started/) [![Demo](https://img.shields.io/badge/Check%20out%20the%20demo!-blue?&style=flat-square&logo=react&logoColor=white)](https://healthsearch-frontend.onrender.com/) [![Weaviate](https://img.shields.io/static/v1?label=version&message=v1.1&color=pink&style=flat-square)](https://weaviate.io/) ![Demo of Health Search](https://github.com/weaviate/healthsearch-demo/blob/main/frontend/public/Intro.gif) ## 🎯 Overview The search functionality in this demo accepts natural language queries that are translated into GraphQL queries using LLMs. These GraphQL queries are then utilized to retrieve supplements from a Weaviate database. The demo also exhibits an example of generative search by providing product summaries generated based on the retrieved objects. > ⚠️ **Disclaimer**: Healthsearch is a technical demonstration, and the results shown should not be treated as health advice. The results and generated summaries are purely based on user-written reviews. ### 💡 Natural Language Translation to GraphQL We use Large Language Models (LLM), like GPT4, to translate natural language queries into a structured query format, called a GraphQL query. The demo extracts information about filters, sorting, and limits directly from the context of the query. Whether the query is `the top 10 products for glowing skin`, `products for sleep from a specific brand`, or `best-rated products for toothache`, the demo can interpret these queries and generate an appropriate GraphQL query in return. ### 🔎 Semantic Search Healthsearch relies on the power of semantic search in user reviews. When seeking products that are `good for joint pain`, for instance, Healthsearch scans user reviews for discussions on products that have alleviated joint pain or similar conditions. The results are then aggregated and grouped according to their respective products. ### 💥 Generative Search After the translation of the query to GraphQL and the retrieval of the most semantically relevant product, we enhance our demo with a feature called `Generative Search`. Essentially, we examine the top five results and employ an LLM to generate a product summary. This concise summary offers a brief overview of the products, highlighting their pros and cons and providing valuable insights. Each summary is crafted around the query, ensuring every search is unique and interesting. ### 🔥 Semantic Cache We embed the generated results and queries to Weaviate, and use it as a `Semantic Cache`. This method is advantageous as it enables the demo to return results from queries that are semantically equal to the new query. For example `good for joint pain` and `helpful for joint pain` are semantically very similar and should return the same results, whereas `bad for joint pain` should have its own generated result. This method allows us to gain much more from generated results than traditional string matching would permit. It's a simple yet potent solution that enhances the efficiency of the search process. 
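To make the caching idea above a bit more concrete, here is a small, hypothetical Python sketch of the pattern. It is not the actual Healthsearch backend code: the toy `embed` function, the stubbed pipeline call, and the similarity threshold are all illustrative assumptions.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy stand-in for a real embedding model (e.g. an OpenAI vectorizer)."""
    vec = np.zeros(26)
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec / (np.linalg.norm(vec) or 1.0)

def run_full_search_pipeline(query: str) -> str:
    """Stub for the expensive path: LLM -> GraphQL -> Weaviate -> generated summary."""
    return f"generated summary for: {query}"

# Cache entries: (query_vector, generated_result)
semantic_cache = []

def lookup_or_generate(query: str, threshold: float = 0.95) -> str:
    query_vec = embed(query)
    # Serve a cached answer if an earlier query is semantically close enough,
    # e.g. "good for joint pain" vs. "helpful for joint pain".
    for cached_vec, cached_result in semantic_cache:
        if float(np.dot(query_vec, cached_vec)) >= threshold:
            return cached_result
    # Otherwise run the full pipeline and cache the generated result.
    result = run_full_search_pipeline(query)
    semantic_cache.append((query_vec, result))
    return result

print(lookup_or_generate("good for joint pain"))
print(lookup_or_generate("helpful for joint pain"))  # likely a cache hit with the toy embedder
```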
## 🔧 Template This repository is designed to serve as a template - a starting point for your own projects with Weaviate. Take inspiration from how we've implemented certain features and feel free to enhance it in your own project. We welcome comments, ideas, and feedback. Embrace the open-source spirit! ## 💰 Language Learning Model (LLM) Costs This demonstration primarily uses OpenAI models for embedding supplement products, processing user queries, and generating summaries. By default, any costs associated with using these services will be billed to the access key that you provide. If you prefer, you can replace the OpenAI models with any other Language Learning Model (LLM) provider. However, please be aware that completely changing the API will require further adjustments to the code. Below, we provide a rough estimate of the costs involved in importing data to Weaviate. For a comprehensive understanding, please visit OpenAI's pricing page at https://openai.com/pricing. ### Data Embedding Costs We employ the Ada v2 model for embedding data into the Weaviate cluster. At the time of writing this README, the model costs $0.0001 for every 1k tokens (note that approximately [4 characters equal 1 token](https://help.openai.com/en/articles/4936856-what-are-tokens-and-how-to-count-them)). As a rough approximation, importing the dataset to Weaviate might cost around $0.002. However, we also provide the same dataset with pre-generated vectors so that it is not required to generate and pay for the product embeddings. The file is called `dataset_100_supplements_with_vectors.json`. The import script automatically detects whether the datasets contains the `vector` key or not. ### Query Construction and Summary Generation Costs We use the GPT-4 model for building GraphQL queries and generating summaries. As of the time of writing this README, this model costs $0.03/1k tokens for input and $0.06/1k tokens for output. The exact costs are dependent on the user query and the results returned by the GraphQL query. Please take these factors into account when calculating your expected costs. You can also change the `model_name` variable to `gpt-3.5-turbo` inside the `api.py` script in the backend folder. The GPT-3 model costs $0.0015/1k tokens for input and $0.002/1k tokens for output. ## 🛠️ Project Structure The Healthsearch Demo is structured in three main components: 1. A Weaviate database (either cluster hosted on WCS or local). 2. A FastAPI endpoint facilitating communication between the LLM provider and database. 3. An interactive React frontend for displaying the information. Make sure you have Python (`>=3.8.0`) and Node (`>=18.16.0`) installed. We also support Docker and provide Dockerfiles for the setup. ## 🐳 Quickstart with Docker You can use Docker to setup the demo in one line of code! If you're not familiar with Docker you can read more about it here (https://docker-curriculum.com/) 1. **Set environment variables:** - The following environment variables need to be set - ```OPENAI_API_KEY=your-openai-api-key``` > Use the `.env` file inside the backend folder to set the variable (https://github.com/theskumar/python-dotenv) > Note that if you're using the GPT-4 model (by default), ensure your OpenAI key has access. You can change the `model_name` variable to `gpt-3.5-turbo` inside the `api.py` script. 1. **Use docker compose** - `docker-compose up` 2. 
**Access the frontend on:** - `localhost:3000` ## 📚 Getting Started To kick-start with the Healthsearch Demo, please refer to the READMEs in the `Frontend` and `Backend` folders: - [Frontend README](./frontend/README.md) - [Backend README](./backend/README.md) ## 💡 Usage Follow these steps to use the Healthsearch Demo: 1. Set up the Weaviate database, FastAPI backend, and the React frontend by following the instructions in their respective READMEs. 2. Launch the database, backend server, and the frontend application. 3. Use the interactive frontend to input your natural language query related to a health condition or benefit. 4. The frontend sends the query to the backend, which transforms the natural language query into a GraphQL query using the LLM. 5. The backend sends the GraphQL query to the Weaviate database to fetch relevant reviews based on the user query. 6. The frontend displays the results, allowing you to explore the most semantic-related supplements to your specific health-related query. ## 💖 Open Source Contribution Your contributions are always welcome! Feel free to contribute ideas, feedback, or create issues and bug reports if you find any! Please adhere to the code guidelines that include formatting, linting, and testing.
flare-ai/Stable-Diffusion-Discord-Bot
https://github.com/flare-ai/Stable-Diffusion-Discord-Bot
A discord bot with many features which uses A1111 as backend and uses my prompt templates for beautiful generations - even with short prompts.
# Stable Diffusion Discord Bot

A Python Discord bot with many features which uses A1111 as backend and uses my prompt templates for beautiful generations - even with short prompts.

## Features

It has many features:
- It generates 2 images
- Ability to upscale the images
- Ability to create small variations of the images
- Retrying with the same settings
- Beautiful images with simple prompts thanks to the prompt templates
- Works with the A1111 - no need for 2 stable diffusion installations
- Blocks direct messages
- Generate random image prompts using a finetuned GPT 2 which runs on cpu (Thanks to: FredZhang7/distilgpt2-stable-diffusion-v2)

## How to use this

First you need to install all the python dependencies:
`pip install -r requirements.txt`

### Bot settings variables

The bot settings are defined in:
- `.env.development` : settings for development and testing
- `.env.deploy` : settings for the deployed bot (copy the `.env.template` file for initial definition)

Most default settings can be kept, but you **must specify** the `BOT_KEY` which you obtain from the Discord bot app setup. (You first have to create a Discord bot at discord.com/developers/ but I won't explain this here. Just make sure that the bot has access to commands and can type messages / embed things. Don't forget to add the bot to your Discord using the generated link in the developer portal with the correct rights, but I think that should be clear. General instructions can be found on [RealPython: how to make a discord bot](https://realpython.com/how-to-make-a-discord-bot-python/))

Finally, start the bot using `python3 bot.py` - after this you can use the bot using /generate or /generate_random.

To change / add styles, add the style to the command array in bot.py and add the preprompt, afterprompt and negative_prompt to prompts.py. There you can also find the prompts for the other styles.

Since this is my first Discord bot, things could probably be solved in a simpler/better way. So feel free to submit a pull request to fix some issues.

## Tips

Load hassanblend (https://huggingface.co/hassanblend/HassanBlend1.5.1.2) in stable diffusion as this is the model all the prompts are tuned on.

## Demo

Feel free to test it out in the #sd-art channel in TheBloke's Discord (https://discord.gg/F7jfGhaGRX)

## Screenshots

<img src="https://raw.githubusercontent.com/CyberTimon/Stable-Diffusion-Discord-Bot/main/examples/example1.png" alt="App Screenshot" width="50%">
<img src="https://raw.githubusercontent.com/CyberTimon/Stable-Diffusion-Discord-Bot/main/examples/example2.png" alt="App Screenshot" width="50%">
<img src="https://raw.githubusercontent.com/CyberTimon/Stable-Diffusion-Discord-Bot/main/examples/example3.png" alt="App Screenshot" width="50%">
invictus717/MetaTransformer
https://github.com/invictus717/MetaTransformer
Meta-Transformer for Unified Multimodal Learning
<p align="center" width="100%"> <img src="assets\Meta-Transformer_banner.png" width="80%" height="80%"> </p> <div> <div align="center"> <a href='https://scholar.google.com/citations?user=KuYlJCIAAAAJ&hl=en' target='_blank'>Yiyuan Zhang<sup>1,2*</sup></a>&emsp; <a href='https://kxgong.github.io/' target='_blank'>Kaixiong Gong<sup>1,2*</sup></a>&emsp; <a href='http://kpzhang93.github.io/' target='_blank'>Kaipeng Zhang<sup>2,†</sup></a>&emsp; </br> <a href='http://www.ee.cuhk.edu.hk/~hsli/' target='_blank'>Hongsheng Li <sup>1,2</sup></a>&emsp; <a href='https://mmlab.siat.ac.cn/yuqiao/index.html' target='_blank'>Yu Qiao <sup>2</sup></a>&emsp; <a href='https://wlouyang.github.io/' target='_blank'>Wanli Ouyang<sup>2</sup></a>&emsp; <a href='http://people.eecs.berkeley.edu/~xyyue/' target='_blank'>Xiangyu Yue<sup>1,†,‡</sup></a> </div> <div> <div align="center"> <sup>1</sup> <a href='http://mmlab.ie.cuhk.edu.hk/' target='_blank'>Multimedia Lab, The Chinese University of Hong Kong</a>&emsp; </br> <sup>2</sup> <a href='https://github.com/OpenGVLab' target='_blank'>OpenGVLab,Shanghai AI Laboratory </a></br> <sup>*</sup> Equal Contribution&emsp; <sup>†</sup> Corresponding Author&emsp; <sup>‡</sup> Project Lead&emsp; </div> ----------------- [![arXiv](https://img.shields.io/badge/arxiv-2307.10802-b31b1b?style=plastic&color=b31b1b&link=https%3A%2F%2Farxiv.org%2Fabs%2F2307.10802)](https://arxiv.org/abs/2307.10802) [![website](https://img.shields.io/badge/Project-Website-brightgreen)](https://kxgong.github.io/meta_transformer/) [![blog-cn](https://img.shields.io/badge/%E6%9C%BA%E5%99%A8%E4%B9%8B%E5%BF%83-%E7%AE%80%E4%BB%8B-brightgreen)](https://mp.weixin.qq.com/s/r38bzqdJxDZUvtDI0c9CEw) [![Hugging Face Spaces](https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Space-blue)](https://huggingface.co/papers/2307.10802) [![OpenXLab](https://cdn-static.openxlab.org.cn/header/openxlab_models.svg)](https://openxlab.org.cn/models/detail/zhangyiyuan/MetaTransformer) ![](https://img.shields.io/github/stars/invictus717/MetaTransformer?style=social) <a href="https://twitter.com/_akhaliq/status/1682248055637041152"><img src="https://img.icons8.com/color/48/000000/twitter.png" width="25" height="25"></a> <a href="https://www.youtube.com/watch?v=V8L8xbsTyls&ab_channel=CSBoard"><img src="https://img.icons8.com/color/48/000000/youtube-play.png" width="25" height="25"></a> <a href='https://huggingface.co/kxgong/Meta-Transformer'> <img src="assets\icons\huggingface.png" width="25" height="25"> </a> <a href='https://open.spotify.com/episode/6JJxcy2zMtTwr4jXPQEXjh'> <img src="https://upload.wikimedia.org/wikipedia/commons/1/19/Spotify_logo_without_text.svg" width="20" height="20"></a> ### 🌟 Single Foundation Model Supports A Wide Range of Applications As a foundation model, Meta-Transformer can handle data from 12 modalities, which determines that it can support a wide range of applications. As shown in this figure, Meta-Transformer can provide services for downstream tasks including stock analysis 📈, weather forecasting ☀️ ☔ ☁️ ❄️ ⛄ ⚡, remote sensing 📡, autonomous driving 🚗, social network 🌍, speech recognition 🔉, etc. 
<p align="center" width="100%"> <img src="assets\Meta-Transformer_application.png" width="100%" height="100%"> </p> **Table 1**: Meta-Transformer is capable of handling up to 12 modalities, including natural language <img src="assets\icons\text.jpg" width="15" height="15">, RGB images <img src="assets\icons\img.jpg" width="15" height="15">, point clouds <img src="assets\icons\pcd.jpg" width="15" height="15">, audio <img src="assets\icons\audio.jpg" width="15" height="15">, videos <img src="assets\icons\video.jpg" width="15" height="15">, tabular data <img src="assets\icons\table.jpg" width="15" height="15">, graphs <img src="assets\icons\graph.jpg" width="15" height="15">, time series data <img src="assets\icons\time.jpg" width="15" height="15">, hyper-spectral images <img src="assets\icons\hyper.jpg" width="15" height="15">, IMU <img src="assets\icons\imu.jpg" width="15" height="15">, medical images <img src="assets\icons\xray.jpg" width="15" height="15">, and infrared images <img src="assets\icons\infrared.jpg" width="15" height="15">. <p align="left"> <img src="assets\Meta-Transformer_cmp.png" width=100%> </p> ## 🚩🚩🚩 Shared-Encoder, Unpaired Data, More Modalities <div> <img class="image" src="assets\Meta-Transformer_teaser.png" width="52%" height="100%"> <img class="image" src="assets\Meta-Transformer_exp.png" width="45.2%" height="100%"> </div> This repository is built to explore the potential and extensibility of transformers for multimodal learning. We utilize the advantages of Transformers to deal with length-variant sequences. We then propose the *Data-to-Sequence* tokenization following a meta-scheme and apply it to 12 modalities including text, image, point cloud, audio, video, infrared, hyper-spectral, X-Ray, tabular, graph, time-series, and Inertial Measurement Unit (IMU) data. <p align="left"> <img src="assets\Meta-Transformer_data2seq.png" width=100%> </p> After obtaining the token sequence, we employ a modality-shared encoder to extract representations across different modalities. With task-specific heads, Meta-Transformer can handle various tasks on the different modalities, such as classification, detection, and segmentation. <p align="left"> <img src="assets\Meta-Transformer_framework.png" width=100%> </p> # 🌟 News * **2023.7.28:** 🎉🎉🎉 We have released well-documented code for graph, tabular, X-Ray, and hyper-spectral data understanding. *We will release the code of Meta-Transformer for autonomous driving and point cloud understanding tasks very soon.* * **2023.7.23:** We have released the code and pretrained weights for image understanding and time-series forecasting. * **2023.7.22:** 🌟🌟🌟 Pretrained weights and a usage demo for our Meta-Transformer have been released. Comprehensive documentation and implementation of the image modality are underway and will be released soon. Stay tuned for more exciting updates!⌛⌛⌛ * **2023.7.21:** The paper is released at [arXiv](https://arxiv.org/abs/2307.10802), and the code will be gradually released. * **2023.7.8:** Github Repository Initialization. 
# 🔓 Model Zoo <!-- <details> --> <summary> Open-source Modality-Agnostic Models </summary> <br> <div> | Model | Pretraining | Scale | #Param | Download | 国内下载源 | | :------------: | :----------: | :----------------------: | :----: | :---------------------------------------------------------------------------------------------------: | :--------: | | Meta-Transformer-B16 | LAION-2B | Base | 85M | [ckpt](https://drive.google.com/file/d/19ahcN2QKknkir_bayhTW5rucuAiX0OXq/view?usp=sharing) | [ckpt](https://download.openxlab.org.cn/models/zhangyiyuan/MetaTransformer/weight//Meta-Transformer_base_patch16_encoder) | Meta-Transformer-L14 | LAION-2B | Large | 302M | [ckpt](https://drive.google.com/file/d/15EtzCBAQSqmelhdLz6k880A19_RpcX9B/view?usp=drive_link) | [ckpt](https://download.openxlab.org.cn/models/zhangyiyuan/MetaTransformer/weight//Meta-Transformer_large_patch14_encoder) </div> <!-- </details> --> <!-- <details> --> <summary>Demo of Use for Pretrained Encoder</summary> ```python from timm.models.vision_transformer import Block ckpt = torch.load("Meta-Transformer_base_patch16_encoder.pth") encoder = nn.Sequential(*[ Block( dim=768, num_heads=12, mlp_ratio=4., qkv_bias=True, norm_layer=nn.LayerNorm, act_layer=nn.GELU ) for i in range(12)]) encoder.load_state_dict(ckpt,strict=True) ``` <!-- </details> --> # 🕙 ToDo - [ ] Meta-Transformer with Large Language Models. - [ ] Multimodal Joint Training with Meta-Transformer. - [ ] Support More Modalities and More Tasks. # Contact 🚀🚀🚀 We aspire to shape this repository into **a formidable foundation for mainstream AI perception tasks across diverse modalities**. Your contributions can play a significant role in this endeavor, and we warmly welcome your participation in our project! To contact us, never hestitate to send an email to `[email protected]` ,`[email protected]`, `[email protected]`, or `[email protected]`! <br></br> &ensp; # Citation If the code and paper help your research, please kindly cite: ``` @article{zhang2023metatransformer, title={Meta-Transformer: A Unified Framework for Multimodal Learning}, author={Zhang, Yiyuan and Gong, Kaixiong and Zhang, Kaipeng and Li, Hongsheng and Qiao, Yu and Ouyang, Wanli and Yue, Xiangyu}, year={2023}, journal={arXiv preprint arXiv:2307.10802}, } ``` # License This project is released under the [Apache 2.0 license](LICENSE). # Acknowledgement This code is developed based on excellent open-sourced projects including [MMClassification](https://github.com/open-mmlab/mmpretrain/tree/mmcls-1.x), [MMDetection](https://github.com/open-mmlab/mmdetection), [MMsegmentation](https://github.com/open-mmlab/mmsegmentation), [OpenPoints](https://github.com/guochengqian/openpoints), [Time-Series-Library](https://github.com/thuml/Time-Series-Library), [Graphomer](https://github.com/microsoft/Graphormer), [SpectralFormer](https://github.com/danfenghong/IEEE_TGRS_SpectralFormer), and [ViT-Adapter](https://github.com/czczup/ViT-Adapter).
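# Usage Sketch (Illustrative)

To extend the pretrained-encoder demo above, here is a rough usage sketch that pushes one image through the shared encoder after a simple patch embedding. The `Conv2d` patch embedding below is an illustrative, untrained stand-in rather than the modality-specific Data-to-Sequence tokenizer described in the paper, so treat this as a shape/usage sketch and not the reference pipeline.

```python
import torch
import torch.nn as nn
from timm.models.vision_transformer import Block

# Rebuild and load the modality-shared encoder exactly as in the demo above.
ckpt = torch.load("Meta-Transformer_base_patch16_encoder.pth")
encoder = nn.Sequential(*[
    Block(dim=768, num_heads=12, mlp_ratio=4., qkv_bias=True,
          norm_layer=nn.LayerNorm, act_layer=nn.GELU)
    for _ in range(12)
])
encoder.load_state_dict(ckpt, strict=True)
encoder.eval()

# Illustrative (untrained) patch embedding: 16x16 patches of a 224x224 RGB image -> 196 tokens of dim 768.
patch_embed = nn.Conv2d(3, 768, kernel_size=16, stride=16)

image = torch.randn(1, 3, 224, 224)                      # stand-in for a real, normalized image
tokens = patch_embed(image).flatten(2).transpose(1, 2)   # (1, 196, 768) token sequence

with torch.no_grad():
    features = encoder(tokens)                           # (1, 196, 768) modality-shared representations
print(features.shape)
```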
camenduru/sdxl-colab
https://github.com/camenduru/sdxl-colab
null
🐣 Please follow me for new updates https://twitter.com/camenduru <br /> 🔥 Please join our discord server https://discord.gg/k5BwmmvJJU <br /> 🥳 Please join my patreon community https://patreon.com/camenduru <br /> ## 🦒 Colab # 🚦 WIP 🚦 | Colab | Info | --- | --- | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/camenduru/sdxl-colab/blob/main/sdxl_v1.0_webui_colab.ipynb) | sdxl_v1.0_webui_colab (1024x1024 model) `webui img2img = refiner` [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/camenduru/sdxl-colab/blob/main/sdxl_v0.9_webui_colab.ipynb) | sdxl_v0.9_webui_colab (1024x1024 model) `webui img2img = refiner` [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/camenduru/sdxl-colab/blob/main/sdxl_v1.0_comfyui_colab.ipynb) | sdxl_v1.0_comfyui_colab (1024x1024 model) please use with `refiner_v1.0.json` [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/camenduru/sdxl-colab/blob/main/sdxl_v0.9_comfyui_colab.ipynb) | sdxl_v0.9_comfyui_colab (1024x1024 model) please use with `refiner_v0.9.json` ## Main Repo https://github.com/Stability-AI/generative-models <br /> https://github.com/comfyanonymous/ComfyUI <br /> ## Page https://stability.ai/blog/sdxl-09-stable-diffusion ## Paper https://arxiv.org/abs//2307.01952 ## Output sdxl_v0.9_comfyui_colab ![Screenshot 2023-07-07 020217](https://github.com/camenduru/sdxl-colab/assets/54370274/05317f2c-fb45-4aac-9bb3-203c15944ba3) sdxl_v1.0_webui_colab ![F1_U90kXoAQ38Ny](https://github.com/camenduru/sdxl-colab/assets/54370274/73a9dafe-d782-4b1c-ae63-1f67580b9abd)
cvetomirkaloyanov/Mevbot-info
https://github.com/cvetomirkaloyanov/Mevbot-info
Information MevBot
# Mevbot Constructing a flashbots bundle for arbitrage Next, I needed to construct my flashbots bundle to send to the relay. Since the detected mempool transaction could be a type 0 or type 2 transaction, I must have scenarios for both. For more on using mempool transactions in your bundles, you can refer to our prior post on using Blocknative with Flashbots. if (transaction.type == 2) { params = [ '0x01', transaction.nonce === 0 ? '0x' : ethers.utils.hexlify(transaction.nonce), ethers.utils.parseEther(ethers.utils.formatEther(transaction.maxPriorityFeePerGas))._hex, ethers.utils.parseEther(ethers.utils.formatEther(transaction.maxFeePerGas))._hex, ethers.utils.hexlify(transaction.gas), transaction.to, transaction.value === '0' ? '0x' : ethers.utils.hexlify(transaction.value), transaction.input, [], transaction.v === '0x0' ? '0x' : transaction.v, transaction.r, transaction.s ] s1 = '0x02'+encode(params).toString('hex'); } else { params = [ transaction.nonce === 0 ? '0x' : ethers.utils.hexlify(transaction.nonce), ethers.utils.parseEther(ethers.utils.formatEther(transaction.gasPrice))._hex, ethers.utils.hexlify(transaction.gas), transaction.to, transaction.value === '0' ? '0x' : ethers.utils.hexlify(transaction.value), transaction.input, transaction.v, transaction.r, transaction.s ]; s1 = '0x'+encode(params).toString('hex'); } I used my modified smart contract to construct my transaction, which would include either two internal transactions or two swaps between the two DEXs. const s2 = await arbContract.populateTransaction.flashArbitrage( pairAddress, otherPairAddress, pairAddress, tokenAddress0, ethers.utils.parseEther('0')._hex, ethers.utils.parseEther('0')._hex ) s2.gasPrice = ethers.utils.hexlify(gasFee) s2.gasLimit = ethers.utils.hexlify(500000) s2.nonce = await wallet.getTransactionCount(); With the transaction constructed, I then created the flashbots bundle and simulated it to ensure the transaction wouldn’t fail for any reason. I also created a final net-profit calculation to ensure I was still actualizing a profit. If my transaction passed the simulation, and I was still making a profit, then I submitted it to the flashbots relay for inclusion. const signedTransactions = await flashbotsProvider.signBundle([ { signedTransaction: s1 }, { signer: wallet, transaction: s2 } ]) const blockNumber = transaction.pendingBlockNumber+1; const simulation = await flashbotsProvider.simulate(signedTransactions, blockNumber); if ('error' in simulation) { console.log(`Simulation Error: ${simulation.error.message}`) } else { if (simulation.firstRevert!==undefined) { console.log(simulation.firstRevert.revert) } else { const net2 = ethers.utils.parseEther(gross) - simulation.results[1].gasUsed*gasFee console.log(`Net: ${ethers.utils.formatEther(net2)} | Pair address: ${pairAddress} | TxHash: ${transaction.hash}`) console.log(simulation) if (net2>0) { console.log(`Coinbase diff: ${simulation.coinbaseDiff}`) const submittedBundle = await flashbotsProvider.sendRawBundle(signedTransactions, blockNumber); const bundleResponse = await submittedBundle.wait(); console.log(bundleResponse) } } }
amazon-science/ContraCLM
https://github.com/amazon-science/ContraCLM
[ACL 2023] Code for ContraCLM: Contrastive Learning For Causal Language Model
# ContraCLM: Contrastive Learning for Causal Language Model This repository contains code for the ACL 2023 paper, [ContraCLM: Contrastive Learning for Causal Language Model](https://arxiv.org/abs/2210.01185). Work done by: Nihal Jain*, Dejiao Zhang*, Wasi Uddin Ahmad*, Zijian Wang, Feng Nan, Xiaopeng Li, Ming Tan, Ramesh Nallapati, Baishakhi Ray, Parminder Bhatia, Xiaofei Ma, Bing Xiang. (* <em>indicates equal contribution</em>). ## Updates * [07-08-2023] Initial release of the code. ## Quick Links * [Overview](#overview) * [Setup](#setup) * [Environment](#environment) * [Datasets](#datasets) * [Pretraining](#pretraining) * [GPT2](#pretrain-gpt2-on-nl-data) * [CodeGen](#pretrain-codegen-350m-mono-on-pl-data) * [Evaluation](#evaluation) * [Citation](#citation) ## Overview <p align="center"> <img src="static/llms_contraclm.png" width=500></img> </p> We present ContraCLM, a novel contrastive learning framework which operates at both the token-level and sequence-level. ContraCLM enhances the discrimination of representations from a decoder-only language model and bridges the gap with encoder-only models, making causal language models better suited for tasks beyond language generation. We encourage you to check out our [paper](https://arxiv.org/abs/2210.01185) for more details. ## Setup The setup involves installing the necessary dependencies in an environment and placing the datasets in the requisite directory. ### Environment Run these commands to create a new conda environment and install the required packages for this repository. ```bash # create a new conda environment with python >= 3.8 conda create -n contraclm python=3.8.12 # install dependencies within the environment conda activate contraclm pip install -r requirements.txt ``` ### Datasets & Preprocessing See <a href="preprocess/">here</a>. ## Pretraining In this section, we show how to use this repository to pretrain (i) `GPT2` on Natural Language (NL) data, and (ii) `CodeGen-350M-Mono` on Programming Language (PL) data. ### Common Instructions 1. This section assumes that you have the train and validation data stored at `TRAIN_DIR` and `VALID_DIR` respectively, and are within an environment with all the above dependencies installed (see [Setup](#setup)). 2. You can get an overview of all the flags associated with pretraining by running: ```bash python pl_trainer.py --help ``` ### Pretrain `GPT2` on NL Data #### Usage ```bash runscripts/run_wikitext.sh``` 1. To quickly test and debug the code, we suggest running it with the MLE loss only by setting ```CL_Config=$(eval echo ${options[1]})``` within the script. 2. All other options involve the CL loss at either the token level or the sequence level. ### Pretrain `CodeGen-350M-Mono` on PL Data #### Usage 1. Configure the variables at the top of `runscripts/run_code.sh`. There are lots of options but only the dropout options are explained here (others are self-explanatory): * `dropout_p`: The dropout probability value used in `torch.nn.Dropout` * `dropout_layers`: If > 0, this will activate the last `dropout_layers` with probability `dropout_p` * `functional_dropout`: If specified, will use a functional dropout layer on top of the token representations output from the CodeGen model 2. Set the variable `CL` according to the desired model configuration. Make sure the paths to `TRAIN_DIR, VALID_DIR` are set as desired. 3. Run the command: `bash runscripts/run_code.sh` ## Evaluation See the relevant task-specific directories [here](evaluation/). 
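## Illustrative Example: Sequence-Level Contrastive Loss

For intuition only, here is a small, self-contained PyTorch sketch of a sequence-level InfoNCE-style contrastive loss of the kind described in the overview. It is not the repository's actual implementation - the pooling, temperature, and pairing of positive views here are assumptions - so refer to `pl_trainer.py` and the run scripts above for the real training code.

```python
import torch
import torch.nn.functional as F

def sequence_contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    """InfoNCE-style loss where z1[i] and z2[i] are two views of sequence i.

    z1, z2: (batch, hidden) pooled sequence representations, e.g. from two dropout passes.
    """
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature               # (batch, batch) similarity matrix
    labels = torch.arange(z1.size(0), device=z1.device)  # positives sit on the diagonal
    return F.cross_entropy(logits, labels)

# Toy usage with a batch of 8 sequences and hidden size 768.
a, b = torch.randn(8, 768), torch.randn(8, 768)
print(sequence_contrastive_loss(a, b).item())
```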
## Citation If you use our code in your research, please cite our work as: ``` @inproceedings{jain-etal-2023-contraclm, title = "{C}ontra{CLM}: Contrastive Learning For Causal Language Model", author = "Jain, Nihal and Zhang, Dejiao and Ahmad, Wasi Uddin and Wang, Zijian and Nan, Feng and Li, Xiaopeng and Tan, Ming and Nallapati, Ramesh and Ray, Baishakhi and Bhatia, Parminder and Ma, Xiaofei and Xiang, Bing", booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)", month = jul, year = "2023", address = "Toronto, Canada", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/2023.acl-long.355", pages = "6436--6459" } ``` ## Security See [CONTRIBUTING](CONTRIBUTING.md#security-issue-notifications) for more information. ## License This project is licensed under the Apache-2.0 License.
capjamesg/visionscript
https://github.com/capjamesg/visionscript
A high-level programming language for using computer vision.
![A VisionScript Notebook counting people in an image](https://raw.githubusercontent.com/capjamesg/visionscript/main/notebook.png) # VisionScript [VisionScript](https://visionscript.dev) is an abstract programming language for doing common computer vision tasks, fast. VisionScript is built in Python, offering a simple syntax for running object detection, classification, and segmentation models. [Read the documentation](https://visionscript.dev/docs/). ## Get Started 🚀 First, install VisionScript: ```bash pip install visionscript ``` You can then run VisionScript using: ```bash visionscript --repl ``` This will open a VisionScript REPL in which you can type commands. ### Run a File 📁 To run a VisionScript file, use: ```bash visionscript ./your_file.vic ``` ### Use VisionScript in a Notebook 📓 VisionScript offers an interactive web notebook through which you can run VisionScript code. To use the notebook, run: ```bash visionscript --notebook ``` This will open a notebook in your browser. Notebooks are ephermal. You will need to copy your code to a file to save it. ## Quickstart 🚀 ### Find people in an image using object detection ``` Load["./photo.jpg"] Detect["person"] Say[] ``` ### Find people in all images in a folder using object detection ``` In["./images"] Detect["person"] Say[] ``` ### Replace people in a photo with an emoji ``` Load["./abbey.jpg"] Size[] Say[] Detect["person"] Replace["emoji.png"] Save["./abbey2.jpg"] ``` ### Classify an image ``` Load["./photo.jpg"] Classify["apple", "banana"] ``` ## Installation 👷 To install VisionScript, clone this repository and run `pip install -r requirements.txt`. Then, make a file ending in `.vic` in which to write your VisionScript code. When you have written your code, run: ```bash python3 lang.py --file ./your_file.vic ``` ### Run in debug mode Running in debug mode shows the full Abstract Syntax Tree (AST) of your code. ```bash python3 lang.py --file ./your_file.vic --debug ``` Debug mode is useful for debugging code while adding new features to the VisionScript language. ## Inspiration 🌟 The inspiration behind this project was to build a simple way of doing one-off tasks. Consider a scenario where you want to run zero-shot classification on a folder of images. With VisionScript, you can do this in two lines of code: ``` In["./images"] Classify["cat", "dog"] ``` VisionScript is not meant to be a full programming language for all vision tasks, rather an abstract way of doing common tasks. VisionScript is ideal if you are new to concepts like "classify" and "segment" and want to explore what they do to an image. ### Syntax The syntax is inspired by both Python and the Wolfram Language. VisionScript is an interpreted language, run line-by-line like Python. Statements use the format: ``` Statement[argument1, argument2, ...] ``` This is the same format as the Wolfram Language. ### Lexical Inference and Memory An (I think!) unique feature in VisionScript compared to other languages is lexical inference. You don't need to declare variables to store images, etc. Rather, you can let VisionScript do the work. Consider this example: ``` Load["./photo.jpg"] Size[] Say[] ``` Here, `Size[]` and `Say[]` do not have any arguments. Rather, they use the last input. Wolfram Alpha has a feature to get the last input using `%`. VisionScript uses the same concept, but with a twist. Indeed, `Size[]` and `Say[]` don't accept any arguments. 
## Developer Setup 🛠 If you want to add new features or fix bugs in the VisionScript language, you will need to set up a developer environment. To do so, clone the language repository: ```bash git clone https://github.com/capjamesg/VisionScript ``` Then, install the required dependencies and VisionScript: ```bash pip install -r requirements.txt pip install -e . ``` Now, you can run VisionScript using: ```bash python3 lang.py ``` ### Tests Tests are run to ensure programs execute in full. Tests do not verify the output of each statement, although this will be added. For now, you can run all test cases using the following command: ```bash python3 test.py ``` ## Supported Models 📚 VisionScript provides abstract wrappers around: - [CLIP](https://github.com/openai/clip) by OpenAI (Classification) - [Ultralytics YOLOv8](https://github.com/ultralytics/ultralytics) (Object Detection Training, Segmentation Training) - [FastSAM](https://github.com/CASIA-IVA-Lab/FastSAM) by CASIA-IVA-Lab. (Segmentation) - [GroundedSAM](https://docs.autodistill.com/base_models/groundedsam/) (Object Detection, Segmentation) - [BLIP](https://github.com/salesforce/BLIP) (Caption Generation) - [ViT](https://github.com/autodistill/autodistill-vit) (Classification Training) ## License 📝 This project is licensed under an [MIT license](LICENSE).
Ericsson/essvt
https://github.com/Ericsson/essvt
A repository for contributions related to the ESSVT test automation tool
# ESSVT Ericsson Support Systems Validation Tool (ESSVT) is an orchestration tool that secures continuous testing to help deliver on time with quality. It's a powerful testing solution designed to enable test automation for Ericsson users and its customers. Its primary purpose is to facilitate testing in CI/CD workflows, although it can also be used for standalone testing. One of ESSVT's key features is its lightweight and user-friendly interface (GUI), which enables easy interaction with both humans and other services. It can be used for functional and non-functional testing. It supports the most common protocols and is especially tailored for verification and validation of telecom equipment. ## ESSVT - NBI OpenAPI All ESSVT operations that can be done in the GUI can also be done using the NBI REST interface. ESSVT utilizes OpenAPI to implement a REST NBI interface. For instance, the interface supports operations to trigger an execution, get execution info, and terminate or re-run an execution. The OpenAPI definition for the ESSVT NBI is available in this repository. ## ESSVT - Robot Framework Support ESSVT has built-in support for Robot Framework and can manage and orchestrate the execution of test cases. Version 3.1 of ESSVT has a default runner that contains Robot Framework 6.0. This image contains a set of Robot Framework libraries that can be used to create test cases. In ESSVT we call this image the default runner. If a library is missing, one can either request to have it added to the default runner so it's available in the next ESSVT release, or create a custom runner with one's own libraries. The library documentation for the ESSVT default runner is available in this repository.
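## Example - Calling the NBI from Python (illustrative)

To illustrate how a CI job could drive the NBI described above, here is a minimal Python sketch using `requests`. The base URL, endpoint paths, payload fields and authentication shown are purely hypothetical placeholders - consult the OpenAPI definition in this repository for the actual interface.

```python
import requests

BASE_URL = "https://essvt.example.com/api"          # hypothetical host and prefix
HEADERS = {"Authorization": "Bearer <api-token>"}   # assumption: token-based auth

# Hypothetical call to trigger an execution (path and payload are illustrative only).
resp = requests.post(
    f"{BASE_URL}/executions",
    json={"project": "demo-project", "suite": "smoke"},
    headers=HEADERS,
    timeout=30,
)
resp.raise_for_status()
execution_id = resp.json()["id"]

# Hypothetical call to poll the execution's status.
status = requests.get(f"{BASE_URL}/executions/{execution_id}", headers=HEADERS, timeout=30).json()
print(status)
```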
regamnash858/RN
https://github.com/regamnash858/RN
null
Examples demonstrating how PHP TimeCop works

# RN
threepointone/partykit-tinybase
https://github.com/threepointone/partykit-tinybase
A demo of a todo app built with tinybase, made multiplayer with partykit
# PartyKit × TinyBase This shows how [y-partykit](https://github.com/partykit/partykit/tree/main/packages/y-partykit) can be used to add multiplayer to a TinyBase app. Based on [TinyBase' TodoApp with Checkpoints demo](https://tinybase.org/demos/todo-app/todo-app-v5-checkpoints/), we use [createYjsPersister](https://tinybase.org/api/persister-yjs/) and synchronise a Y.js document using a PartyKit endpoint. https://github.com/threepointone/partykit-tinybase/assets/18808/f5b66676-eba8-4dfd-8378-6c9996d81696
nix-community/nix-github-actions
https://github.com/nix-community/nix-github-actions
A library to turn Nix Flake attribute sets into Github Actions matrices [maintainer=@adisbladis]
# nix-github-actions This is a library to turn Nix Flake attribute sets into Github Actions matrices. **Features:** - Unopinionated Install Nix using any method you like - Flexible Nix-github-actions is not an action in itself but a series of templates and a Nix library to build your own CI. - Parallel job execution Use one Github Actions runner per package attribute ## Usage ### Quickstart nix-github-actions comes with a quickstart script that interactively guides you through integrating it: ``` bash $ nix run github:nix-community/nix-github-actions ``` ### Manual 1. Find a CI template in [./.github/workflows](./.github/workflows) and copy it to your project 2. Integrate into your project #### Using Flake atttribute packages - `flake.nix` ``` nix { inputs.nix-github-actions.url = "github:nix-community/nix-github-actions"; inputs.nix-github-actions.inputs.nixpkgs.follows = "nixpkgs"; outputs = { self, nixpkgs, nix-github-actions }: { githubActions = nix-github-actions.lib.mkGithubMatrix { checks = self.packages; }; packages.x86_64-linux.hello = nixpkgs.legacyPackages.x86_64-linux.hello; packages.x86_64-linux.default = self.packages.x86_64-linux.hello; }; } ``` #### Using Flake attribute checks - `flake.nix` ``` nix { inputs.nix-github-actions.url = "github:nix-community/nix-github-actions"; inputs.nix-github-actions.inputs.nixpkgs.follows = "nixpkgs"; outputs = { self, nixpkgs, nix-github-actions }: { githubActions = nix-github-actions.lib.mkGithubMatrix { inherit (self) checks; }; checks.x86_64-linux.hello = nixpkgs.legacyPackages.x86_64-linux.hello; checks.x86_64-linux.default = self.packages.x86_64-linux.hello; }; } ```
neoforged/NeoForge
https://github.com/neoforged/NeoForge
Neo Modding API for Minecraft, based on Forge
![NeoForged Logo](assets/NeoForged_logo.svg) NeoForge ============= [![Discord](https://img.shields.io/discord/313125603924639766.svg?color=%237289da&label=Discord&logo=discord&logoColor=%237289da)][Discord] Forge is a free, open-source modding API all of your favourite mods use! | Version | Support | |---------| ------------- | | 1.20.x | Active | * [Download] * [Discord] * [Documentation] # Installing NeoForged Go to [CurseForge][CurseForge] project, select the minecraft version and installer, and run it. You can download the installer for the *Recommended Build* or the *Latest build* there. Latest builds may have newer features but may be more unstable as a result. The installer will attempt to install Forge into your vanilla launcher environment, where you can then create a new profile using that version and play the game! For support and questions, visit [the NeoForged Discord server][Discord]. # Creating Mods [See the "Getting Started" section in the NeoForged Documentation][Getting-Started]. # Contribute to Forge If you wish to actually inspect Forge, submit PRs or otherwise work with Forge itself, you're in the right place! [See the guide to setting up a Forge workspace][ForgeDev]. ### Pull requests [See the "Making Changes and Pull Requests" section in the Forge documentation][Pull-Requests]. Please read the contributing guidelines found [here][Contributing] before making a pull request. ### Contributor License Agreement We require all contributors to acknowledge the [Neoforged Contributor License Agreement][CLA]. Please ensure you have a valid email address associated with your GitHub account to do this. If you have previously signed it, you should be OK. #### Donate *NeoForged is a large project with many collaborators working on it around the clock. It will always remain free to use and modify. However, it costs money to run such a large project as this, so please consider visiting our opencollective.* [Contributing]: ./CONTRIBUTING.md [CLA]: https://cla-assistant.io/MinecraftForge/MinecraftForge [Download]: https://maven.neoforged.net/releases/net/neoforged/forge/ [Discord]: https://discord.neoforged.net/ [Documentation]: https://docs.neoforged.net/ [Getting-Started]: https://docs.neoforged.net/en/latest/gettingstarted/ [ForgeDev]: https://docs.neoforged.net/en/latest/forgedev/ [Pull-Requests]: https://docs.neoforged.net/en/latest/forgedev/#making-changes-and-pull-requests [CurseForge]: https://curseforge.com/placeholder
Mwos77/mwosstealer77
https://github.com/Mwos77/mwosstealer77
A discord token grabber, crypto wallet stealer, cookie stealer, password stealer, file stealer etc
<h1 align="center"> MStealer </h1> <p align="center"> <kbd> <img src="https://cdn.discordapp.com/attachments/1105744953793052745/1126624527179333783/mwos.png" width="400"></img> </kbd> </p> <p align="center"> <img src=""> <img src=""> <br> <img src=""> <img src=""> <br> <img src=""> <img src=""> <br> </p> <h1 align="center"> https://t.me/mwosstealerr<br> 🔱 Join Telegram for support and contact 🔱 </h1> **NOTE:** - Disclaimer - I'm not responsible for any damages this software may cause after being acquired. This software was made for personal **education** and **sandbox testing** purposes --- ## <a id="content"></a>🌐 〢 Content - [🌌・Telegram](https://t.me/mwosstealerr) - [・Discord](https://discord.gg/tbfTAVzSZ) - [🎉・Setting up](#setup) - [🔰・Features](#features) - [👁️・Preview](#preview) - [📝・Changelog](#changelog) - [💼・Terms](#terms) ## <a id="setup"></a> 📁 〢 Setting up 1. Install [Python](https://www.python.org/ftp/python/3.11.4/python-3.11.4-amd64.exe) 2. Download [MStealer Files] 3. Install all requirements [install.bat] 4. Click on start.bat [start.bat] 5. Complete the configuration 6. You must be have your Python/Executable file ready ## <a id="features"></a>💎 〢 Features ```diff > Default: - Anti-VM - Anti-Debug - Bypass Windows Defender - Bypass VirusTotal Machines - Steals Cookie, Password information from all Chromium based browsers(Chrome, Edge, OperaGX, Opera, Brave, Yandex and more) - Keyword Filtering able to see which websites information gathered without download - Grabs MetaMask, Exodus, Atomic, Coinbase, Binance, Trust Wallet, Phantom Wallet - Injection Discord, Discord Canary, DiscordPTB, Lightcord - Bypass Discord Token Protector, BetterDiscord - Gather Discord Token, Phone, Email, Badge, Billing - Validates found Discord Token and then sends it to your Webhook - Fetches HQ Friends (Early Supporter, Active Developer, Bot Developer) - Sends All Data Through Discord Webhook - File Stealer (Seed Phrases, Tokens, Private Keys, Recovery Codes, Backup Codes, 2FA) - Grabs Steam, Telegram, Riot Games Session - GUI Builder - Customizable Icon, Description, Name, Version - Add to Startup - Gather Network Information > Injection Discord: - Nitro Auto Buy - New Passwords - New Emails - New PayPal - New Credit Card - New Login > + More! ``` ## <a id="changelog"></a>💭 〢 ChangeLog ```diff v1.0 ⋮ 29.07.2023 + New README created + Removed malicious codes + Changed footer texts + Improved builder + Code optimization + Fixed missing emojis + Changed MStealer icon + Added new Anti-VM for less AV detections + Added Exodus Web3 Wallet Extension support + Active Developer badge support ``` ## <a id="preview"></a>👁️ 〢 Preview ![](https://raw.githubusercontent.com/Mwos77/mwosstealer77/main/img/ss4.png) ![](https://raw.githubusercontent.com/Mwos77/mwosstealer77/main/img/ss1.png) ![](https://raw.githubusercontent.com/Mwos77/mwosstealer77/main/img/ss2.png) ![](https://raw.githubusercontent.com/Mwos77/mwosstealer77/main/img/ss3.png) ### <a id="terms"></a>💼 〢 Terms Of Usage - [x] This software is provided as-is and I am not responsible for any damages that may occur after acquiring or using it. Please understand that this software is intended for personal educational purposes and sandbox testing only. Use it at your own risk. I make no guarantees about its performance or suitability for specific purposes. It is important to note that this software is not intended for critical or sensitive environments. Any usage beyond its intended purposes is your own decision and responsibility. 
You are solely responsible for evaluating your own technical abilities and knowledge to use the software appropriately. I cannot be held liable for any actions you take based on the information or functionality provided by the software. By acquiring or using this software, you agree to release me from any claims or liabilities. Please seek legal advice if you have any specific concerns or questions about the software's usage. - [x] By using/downloading this repository, you agree to the Commons Clause license and that you're not allowed to sell this repository or any code from this repository. For more info see https://commonsclause.com/ - [x] You're free to contribute the project ### Author - [Mwos77](https://github.com/Mwos77) <a href=#top>Back to Top</a></p>
Venusdev2113/javascript-code
https://github.com/Venusdev2113/javascript-code
this is javascript code for animation.
# javascript-several-animation-effect I made this project to include a lot of animation effects.