content
stringlengths
10
4.9M
The 58 men and women who worked on the original draft of the new Florida science standards are now back at it. They’re going through all the public and professional input that has flooded in over the past few months. May the force be with them. This opinion article in the Miami Herald by writer Fred Grimm gave me some serious things to ponder. Oscar Howard Jr., superintendent of Taylor County’s School District, and Danny Lundy, vice chairman of the School Board, spoke in accents from that other Florida. ”We’re opposed to teaching evolution as a fact,” Howard said, adding that his School Board and 11 others have passed resolutions against the imposition of evolution in the school curriculum. Wait a minute. Did I read that right? Eleven other counties passed anti-evolution resolutions? How in the world did that escape any real public notice until now? I’m only aware of three right now. I can understand a few school boards hiding in the shadows and only attracting the attention of their little weekly newspapers. But 12 total? TWELVE?! And here I was feeling so clever for discovering one by poking around in search engines. I feel silly now. And here’s a real good point that should shame just about everyone that has been stomping their feet about how offensive evolution is to them. Then, the final speaker, Lisa Dizengoff, director of science curriculum at Pembroke Pines Charter School’s east campus, angrily reminded the crowd that after all the carping over evolution, no one had gotten around to addressing the state’s lackadaisical, last-century approach to science education. ”All I heard was this argument about evolution,” she said, disgusted that so many other problems had been preempted by a single controversy. “The kids lost out again.” Yes, I agree.
// Resolve a DNS query with the upstream resolver and strip out any extra or NS // records in the response. func (r *ResponseMinimize) Resolve(q *dns.Msg, ci ClientInfo) (*dns.Msg, error) { answer, err := r.resolver.Resolve(q, ci) if err != nil || answer == nil || answer.Rcode != dns.RcodeSuccess { return answer, err } logger(r.id, q, ci).Debug("stripping response") answer.Extra = nil answer.Ns = nil return answer, nil }
# =========================================================================== # # PUBLIC DOMAIN NOTICE # National Center for Biotechnology Information # # This software/database is a "United States Government Work" under the # terms of the United States Copyright Act. It was written as part of # the author's official duties as a United States Government employee and # thus cannot be copyrighted. This software/database is freely available # to the public for use. The National Library of Medicine and the U.S. # Government have not placed any restriction on its use or reproduction. # # Although all reasonable efforts have been taken to ensure the accuracy # and reliability of the software and data, the NLM and the U.S. # Government do not and cannot warrant the performance or results that # may be obtained by using this software or data. The NLM and the U.S. # Government disclaim all warranties, express or implied, including # warranties of performance, merchantability or fitness for any particular # purpose. # # Please cite the author in any work or product based on this material. # # =========================================================================== # # from ctypes import cdll, c_char, c_int, c_char_p, c_int32, c_int64, c_double, POINTER, c_size_t, c_void_p, c_uint64, c_uint32 import os, sys, platform, tempfile, subprocess if sys.version_info[0] > 2: from urllib.parse import urlencode from urllib.request import urlopen else: from urllib import urlencode from urllib2 import urlopen from .ErrorMsg import check_res_embedded from .ErrorMsg import ErrorMsg def process_bits(): """Return bitness of the running python process, or None if unknown.""" architecture2bits = {'64bit': 64, '32bit': 32} return architecture2bits.get(platform.architecture()[0], None) def lib_filename(lib_name): if platform.system() == "Windows": return lib_name+LibManager.get_lib_extension() else: return "lib"+lib_name+LibManager.get_lib_extension() def load_saved_library(lib_name): """search library in different possible locations and load it if found """ lib = None for dir in LibManager.get_directories_to_find_dll(): try: lib = cdll.LoadLibrary(os.path.join(dir, lib_filename(lib_name) )) if lib: break except OSError: pass return lib def load_updated_library(lib_name): """download library from ncbi and load it """ directory_created = (False, None) file_created = (False, None) file_saved = (False, None) for dir in LibManager.get_directories_to_find_dll(): try: if not dir: continue if not os.path.exists(dir): os.makedirs(dir) directory_created = (True, dir) except: directory_created = (False, dir) continue lib_path = os.path.join(dir, lib_filename(lib_name)) try: f = open(lib_path, "wb") file_created = (True, lib_path) except: file_created = (False, lib_path) continue params = urlencode({ 'cmd': 'lib', 'libname': lib_name, 'os_name': LibManager.get_post_os_name_param(), 'bits': str(process_bits()) }) resp = urlopen(url=LibManager.URL_NCBI_SRATOOLKIT, data=params.encode()) if resp.code != 200: raise ErrorMsg("Failed to download dll: url=" + LibManager.URL_NCBI_SRATOOLKIT + "; params=" + params + "; response code=" + str(resp.code)) try: f.write(resp.read()) f.close() file_saved = (True, lib_path) except: file_saved = (False, lib_path) continue return cdll.LoadLibrary(lib_path) if not directory_created[0]: raise ErrorMsg("Failed to create directory '" + directory_created[1] + "' for " + lib_filename(lib_name)) elif not file_created[0]: raise ErrorMsg("Failed to create file " + file_created[1]) elif not file_saved[0]: 
raise ErrorMsg("Failed to save file " + file_saved[1]) def version_tuple(version_str): assert isinstance(version_str, str) or isinstance(version_str, unicode) import re arr = re.split('\.\s*|\-\s*|\s+', version_str) arr_typed = [] for x in arr: try: arr_typed.append(int(x)) except ValueError: arr_typed.append(x) # TODO: apply a smarter parsing here if needed return tuple ( arr_typed ) def get_library_version_tuple_remote(lib_name): params = urlencode({ 'cmd': 'vers', 'libname': lib_name, 'os_name': LibManager.get_post_os_name_param(), 'bits': str(process_bits()) }) resp = urlopen(url=LibManager.URL_NCBI_SRATOOLKIT, data=params.encode()) if resp.code != 200: raise ErrorMsg("Failed to query dll version: url=" + LibManager.URL_NCBI_SRATOOLKIT + "; params=" + params + "; response code=" + str(resp.code)) version_str = resp.read().strip() if isinstance(version_str, bytes): version_str = version_str.decode(encoding='utf8') return version_tuple (version_str) def should_download_library(): do_download = os.environ.get("NGS_PY_DOWNLOAD_LIBRARY", "1") return do_download.lower() in ("1", "yes", "true", "on") def load_library(lib_name, do_download, silent): if do_download: library = load_updated_library(lib_name) else: library = load_saved_library(lib_name) if not library and not silent: raise ErrorMsg("Failed to load library " + lib_name + " (NGS_PY_DOWNLOAD_LIBRARY=" + os.environ.get("NGS_PY_DOWNLOAD_LIBRARY", "<not set>") + ", " + "NGS_PY_LIBRARY_PATH=" + os.environ.get("NGS_PY_LIBRARY_PATH", "<not set>") + ", " + "do_download=" + str(do_download) + ")") else: return library class LibManager: c_lib_engine = None c_lib_sdk = None URL_NCBI_SRATOOLKIT = 'https://trace.ncbi.nlm.nih.gov/Traces/sratoolkit/sratoolkit.cgi' def _bind(self, c_lib, c_func_name_str, param_types_list, errorcheck): setattr(self, c_func_name_str, getattr(c_lib, c_func_name_str)) func = getattr(self, c_func_name_str) func.argtypes = param_types_list func.restype = c_int if errorcheck: func.errcheck = errorcheck def bind_sdk(self, c_func_name_str, param_types_list): return self._bind(self.c_lib_sdk, c_func_name_str, param_types_list, check_res_embedded) @staticmethod def get_directories_to_find_dll(): env_path = os.environ.get("NGS_PY_LIBRARY_PATH", None) if env_path is None: return ( os.path.join(os.path.expanduser('~'), ".ncbi", "lib"+str(process_bits())), "", ".", tempfile.gettempdir(), ) else: return ( env_path, ) @staticmethod def get_post_os_name_param(): if (platform.system() == "Darwin"): return "Mac" else: return platform.system() @staticmethod def get_lib_extension(): if platform.system() == "Windows": return ".dll" elif platform.system() == "Darwin": return ".dylib" elif platform.system() == "Linux": return ".so" else: return "" def initialize_ngs_bindings(self): if self.c_lib_engine and self.c_lib_sdk: # already initialized return # check versions - must be run in a separate script to free library before overwriting it # check_vers_res = os.system('python -c "from ngs import NGS; exit(NGS.checkLibVersions())"') # if platform.system() != "Windows": # check_vers_res = check_vers_res >> 8 # python is a cross-platform language # os.system is not that reliable and cross-platform as subprocess. 
So using subprocess check_vers_res = subprocess.call([sys.executable, "-c", "from ngs import NGS; exit(NGS.checkLibVersions())"]) do_update_engine = check_vers_res & 1 do_update_sdk = check_vers_res & 2 libname_engine = "ncbi-vdb" libname_sdk = "ngs-sdk" self.c_lib_engine = load_library(libname_engine, do_update_engine, silent=False) self.c_lib_sdk = load_library(libname_sdk, do_update_sdk, silent=False) ############## ngs-engine imports below #################### self._bind(self.c_lib_engine, "PY_NGS_Engine_ReadCollectionMake", [c_char_p, POINTER(c_void_p), POINTER(c_char), c_size_t], None) self._bind(self.c_lib_engine, "PY_NGS_Engine_ReferenceSequenceMake", [c_char_p, POINTER(c_void_p), POINTER(c_char), c_size_t], None) self._bind(self.c_lib_engine, "PY_NGS_Engine_SetAppVersionString", [c_char_p, POINTER(c_char), c_size_t], None) self._bind(self.c_lib_engine, "PY_NGS_Engine_GetVersion", [POINTER(c_char_p), POINTER(c_char), c_size_t], None) self._bind(self.c_lib_engine, "PY_NGS_Engine_IsValid", [c_char_p, POINTER(c_int), POINTER(c_char), c_size_t], None) # self._bind(self.c_lib_engine, "PY_NGS_Engine_RefcountRelease", [c_void_p, POINTER(c_char), c_size_t], None) # self._bind(self.c_lib_engine, "PY_NGS_Engine_StringData", [c_void_p, POINTER(c_char_p)], None) # self._bind(self.c_lib_engine, "PY_NGS_Engine_StringSize", [c_void_p, POINTER(c_size_t)], None) ############## ngs-sdk imports below #################### # Common self._bind(self.c_lib_sdk, "PY_NGS_StringGetData", [c_void_p, POINTER(c_char_p)], None) self._bind(self.c_lib_sdk, "PY_NGS_StringGetSize", [c_void_p, POINTER(c_size_t)], None) self._bind(self.c_lib_sdk, "PY_NGS_RawStringRelease", [c_void_p, POINTER(c_void_p)], None) self._bind(self.c_lib_sdk, "PY_NGS_RefcountRelease", [c_void_p, POINTER(c_void_p)], None) # ReadCollection self.bind_sdk("PY_NGS_ReadCollectionGetName", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionGetReadGroups", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionHasReadGroup", [c_void_p, c_char_p, POINTER(c_int), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionGetReadGroup", [c_void_p, c_char_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionGetReferences", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionHasReference", [c_void_p, c_char_p, POINTER(c_int), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionGetReference", [c_void_p, c_char_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionGetAlignment", [c_void_p, c_char_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionGetAlignments", [c_void_p, c_uint32, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionGetAlignmentCount", [c_void_p, c_uint32, POINTER(c_uint64), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionGetAlignmentRange", [c_void_p, c_uint64, c_uint64, c_uint32, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionGetRead", [c_void_p, c_char_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionGetReads", [c_void_p, c_uint32, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionGetReadCount", [c_void_p, c_uint32, POINTER(c_uint64), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadCollectionGetReadRange", [c_void_p, c_uint64, c_uint64, c_uint32, POINTER(c_void_p), POINTER(c_void_p)]) # Alignment self.bind_sdk("PY_NGS_AlignmentGetAlignmentId", 
[c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetReferenceSpec", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetMappingQuality", [c_void_p, POINTER(c_int32), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetReferenceBases", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetReadGroup", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetReadId", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetClippedFragmentBases", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetClippedFragmentQualities", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetAlignedFragmentBases", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetAlignmentCategory", [c_void_p, POINTER(c_uint32), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetAlignmentPosition", [c_void_p, POINTER(c_int64), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetAlignmentLength", [c_void_p, POINTER(c_uint64), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetIsReversedOrientation", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetSoftClip", [c_void_p, c_uint32, POINTER(c_int32), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetTemplateLength", [c_void_p, POINTER(c_uint64), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetShortCigar", [c_void_p, c_int, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetLongCigar", [c_void_p, c_int, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetRNAOrientation", [c_void_p, POINTER(c_char), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentHasMate", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetMateAlignmentId", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetMateAlignment", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetMateReferenceSpec", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentGetMateIsReversedOrientation", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_AlignmentIteratorNext", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) # Fragment self.bind_sdk("PY_NGS_FragmentGetFragmentId", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_FragmentGetFragmentBases", [c_void_p, c_uint64, c_uint64, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_FragmentGetFragmentQualities", [c_void_p, c_uint64, c_uint64, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_FragmentIsPaired", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_FragmentIsAligned", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_FragmentIteratorNext", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) # Package self.bind_sdk("PY_NGS_PackageGetPackageVersion", [POINTER(c_void_p), POINTER(c_void_p)]) # PileupEvent self.bind_sdk("PY_NGS_PileupEventGetMappingQuality", [c_void_p, POINTER(c_int32), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupEventGetAlignmentId", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupEventGetAlignmentPosition", [c_void_p, POINTER(c_int64), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupEventGetFirstAlignmentPosition", [c_void_p, POINTER(c_int64), POINTER(c_void_p)]) 
self.bind_sdk("PY_NGS_PileupEventGetLastAlignmentPosition", [c_void_p, POINTER(c_int64), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupEventGetEventType", [c_void_p, POINTER(c_uint32), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupEventGetAlignmentBase", [c_void_p, POINTER(c_char), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupEventGetAlignmentQuality", [c_void_p, POINTER(c_char), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupEventGetInsertionBases", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupEventGetInsertionQualities", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupEventGetEventRepeatCount", [c_void_p, POINTER(c_uint32), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupEventGetEventIndelType", [c_void_p, POINTER(c_uint32), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupEventIteratorNext", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupEventIteratorReset", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) # Pileup self.bind_sdk("PY_NGS_PileupGetReferenceSpec", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupGetReferencePosition", [c_void_p, POINTER(c_int64), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupGetReferenceBase", [c_void_p, POINTER(c_char), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupGetPileupDepth", [c_void_p, POINTER(c_uint32), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_PileupIteratorNext", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) # ReadGroup self.bind_sdk("PY_NGS_ReadGroupGetName", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadGroupGetStatistics", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadGroupIteratorNext", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) # Read self.bind_sdk("PY_NGS_ReadGetReadId", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadGetNumFragments", [c_void_p, POINTER(c_uint32), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadFragmentIsAligned",[c_void_p, c_uint32, POINTER(c_int32), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadGetReadCategory", [c_void_p, POINTER(c_uint32), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadGetReadGroup", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadGetReadName", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadGetReadBases", [c_void_p, c_uint64, c_uint64, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadGetReadQualities", [c_void_p, c_uint64, c_uint64, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReadIteratorNext", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) # Reference self.bind_sdk("PY_NGS_ReferenceGetCommonName", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceGetCanonicalName", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceGetIsCircular", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceGetLength", [c_void_p, POINTER(c_uint64), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceGetReferenceBases", [c_void_p, c_uint64, c_uint64, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceGetReferenceChunk", [c_void_p, c_uint64, c_uint64, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceGetAlignment", [c_void_p, c_char_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceGetAlignments", [c_void_p, c_uint32, POINTER(c_void_p), POINTER(c_void_p)]) 
self.bind_sdk("PY_NGS_ReferenceGetAlignmentSlice", [c_void_p, c_int64, c_uint64, c_uint32, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceGetFilteredAlignmentSlice", [c_void_p, c_int64, c_uint64, c_uint32, c_uint32, c_int32, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceGetPileups", [c_void_p, c_uint32, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceGetFilteredPileups", [c_void_p, c_uint32, c_uint32, c_int32, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceGetPileupSlice", [c_void_p, c_int64, c_uint64, c_uint32, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceGetFilteredPileupSlice", [c_void_p, c_int64, c_uint64, c_uint32, c_uint32, c_int32, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceIteratorNext", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) # ReferenceSequence self.bind_sdk("PY_NGS_ReferenceSequenceGetCanonicalName", [c_void_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceSequenceGetIsCircular", [c_void_p, POINTER(c_int), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceSequenceGetLength", [c_void_p, POINTER(c_uint64), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceSequenceGetReferenceBases", [c_void_p, c_uint64, c_uint64, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_ReferenceSequenceGetReferenceChunk", [c_void_p, c_uint64, c_uint64, POINTER(c_void_p), POINTER(c_void_p)]) # Statistics self.bind_sdk("PY_NGS_StatisticsGetValueType", [c_void_p, c_char_p, POINTER(c_uint32), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_StatisticsGetAsString", [c_void_p, c_char_p, POINTER(c_void_p), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_StatisticsGetAsI64", [c_void_p, c_char_p, POINTER(c_int64), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_StatisticsGetAsU64", [c_void_p, c_char_p, POINTER(c_uint64), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_StatisticsGetAsDouble", [c_void_p, c_char_p, POINTER(c_double), POINTER(c_void_p)]) self.bind_sdk("PY_NGS_StatisticsGetNextPath", [c_void_p, c_char_p, POINTER(c_void_p), POINTER(c_void_p)])
On Quantum Effects in Soft Leptogenesis It has been recently shown that quantum Boltzman equations may be relevant for leptogenesis. Quantum effects, which lead to a time-dependent CP asymmetry, have been shown to be particularly important for resonant leptogenesis when the asymmetry is generated by the decay of two nearly degenerate states. In this work we investigate the impact of the use of quantum Boltzman equations in the framework ``soft leptogenesis'' in which supersymmetry soft-breaking terms give a small mass splitting between the CP-even and CP-odd right-handed sneutrino states of a single generation and provide the CP-violating phase to generate the lepton asymmetry. Introduction The discovery of neutrino oscillations makes leptogenesis a very attractive solution to the baryon asymmetry problem . In the standard type I seesaw framework , the singlet heavy neutrinos have lepton number violating Majorana masses and when decay out of equilibrium produce dynamically a lepton asymmetry which is partially converted into a baryon asymmetry due to fast sphaleron processes. For a hierarchical spectrum of right-handed (RH) neutrinos, successful leptogenesis requires generically quite heavy singlet neutrino masses , of order M > 2.4(0.4) × 10 9 GeV for vanishing (thermal) initial neutrino densities (although flavour effects and/or extended scenarios may affect this limit). Low-energy supersymmetry can be invoked to naturally stabilize the hierarchy between this new scale and the electroweak one. This, however, introduces a certain conflict between the gravitino bound on the reheat temperature and the thermal production of RH neutrinos . A way out of this conflict is provided by resonant leptogenesis . In this scenario RH neutrinos are nearly degenerate in mass which makes the self energy contributions to the CP asymmetries resonantly enhanced and allowing leptogenesis to be possible at much lower temperatures. Once supersymmetry has been introduced, leptogenesis is induced also in singlet sneutrino decays. If supersymmetry is not broken, the order of magnitude of the asymmetry and the basic mechanism are the same as in the non-supersymmetric case. However, as shown in Refs. , supersymmetry-breaking terms can play an important role in the lepton asymmetry generated in sneutrino decays because they induce effects which are essentially different from the neutrino ones. In brief, soft supersymmetry-breaking terms involving the singlet sneutrinos remove the mass degeneracy between the two real sneutrino states of a single neutrino generation, and provide new sources of lepton number and CP violation. As a consequence, the mixing between the sneutrino states generates a CP asymmetry in their decays. At zero temperature and at lowest order in the soft supersymmetry-breaking couplings, the asymmetries generated in the sneutrino decays into fermions and scalars cancel out. However, thermal effects break this cancellation and once they are included the asymmetry can be sizable. In particular it is large for a RH neutrino mass scale relatively low, in the range 10 5 − 10 8 GeV, well below the reheat temperature limits, what solves the cosmological gravitino problem. This scenario has been termed "soft leptogenesis", since the soft terms and not flavour physics provide the necessary mass splitting and CP-violating phase. 
In general, soft leptogenesis induced by CP violation in mixing as discussed above has the drawback that in order to generate enough asymmetry the lepton-violating soft bilinear coupling, responsible for the sneutrino mass splitting, has to be unconventionally small . In this case, as for the case of resonant leptogenesis, the sneutrino self energy contributions to the CP asymmetries are resonantly enhanced ‡ ‡ Considering the possibility of CP violation also in decay and in the interference of mixing and decay Till recently, the dynamics of thermal leptogenesis (both for the standard see-saw case, as well as for the soft leptogenesis scenario) has been studied using the approach of classical Boltzmann equations (BE). The possibility of using quantum Boltzmann equations (QBE) was first discussed in Ref. and it has been recently derived in detail in Ref. . In Ref. QBE were obtained starting from the non-equilibrium quantum field theory based on the Closed Time-Path formulation. They differ from the classical BE in that they contain integrals over the past times unlike in the classical kinetic theory in which the scattering terms do not include any integral over the past history of the system which is equivalent to assume that any collision in the plasma does not depend upon the previous ones. Quantitatively the most important consequence is that the CP asymmetry acquires an additional time-dependent piece, with its value at a given instant depending upon the previous history of the system. If the time variation of the CP asymmetry is shorter than the relaxation time of the particles abundances, the solutions to the quantum and the classical Boltzmann equations are expected to differ only by terms of the order of the ratio of the time-scale of the CP asymmetry to the relaxation time-scale of the distribution. This is typically the case in thermal leptogenesis with hierarchical RH neutrinos . However, as discussed in Refs. , in the resonant leptogenesis scenario, (M j − M i ) is of the order of the decay rate of the RH neutrinos. As a consequence the typical time-scale to build up coherently the time-dependent CP asymmetry, which is of the order of (M j − M i ) −1 , can be larger than the time-scale for the change of the abundance of the RH neutrinos. This, as shown in Refs. , leads to quantitative differences between the classical and the quantum approach in the case of resonant leptogenesis and, in particular, in the weak washout regime they enhance the produced asymmetry. Motivated by these results and the fact that in soft leptogenesis the CP asymmetry is produced resonantly, we perform a detailed study of the role of quantum effects in the soft leptogenesis scenario. Our results show that because of the thermal nature of soft leptogenesis, the dependence of the quantum effects on the washout regime for soft leptogenesis is quantitatively different than in the see-saw resonant scenario. In particular in the weak washout regime quantum effects do not enhance but suppress the produced baryon asymmetry. Quantum effects are most quantitatively important for extremely degenerate sneutrinos (that is far away from the resonant condition), ∆M ≪ Γ N , and in the strong washout regime they can lead to an enhancement of the produced asymmetry, as well as change of sign, of the produced asymmetry. But altogether, for a given M the required values of the lepton violating soft bilinear term B to achieve successful leptogenesis are not substantially modified. 
Brief Summary of Soft Leptogenesis: BE and CP asymmetry The supersymmetric see-saw model could be described by the superpotential: where i, j = 1, 2, 3 are flavour indices and N i , L i , H are the chiral superfields for the RH neutrinos, the left-handed (LH) lepton doublets and the Higgs doublets with ǫ αβ = −ǫ βα and ǫ 12 = +1. The corresponding soft breaking terms involving the RH sneutrinosÑ i are given by: wherel T i = ν i ,l − i and h T = (h + , h 0 ) are the slepton and up-type Higgs doublets. As a consequence of the soft breaking B terms, the sneutrino and antisneutrino states mix with mass eigenvectors where Φ ≡ arg(BM) and with mass eigenvalues In what follows, we will consider a single generation of N and N which we label as 1. We also assume proportionality of soft trilinear terms and drop the flavour indices for the coefficients A and B. As discussed in Refs. , in this case, after superfield rotations the Lagrangians (1) and (2) have a unique independent physical CP violating phase, φ = arg(AB * ) which we chose to assign to A. Neglecting supersymmetry breaking effects in the right sneutrino masses and in the vertex, the total singlet sneutrino decay width is given by where v u is the vacuum expectation value of the up-type Higgs doublet, v u = v sin β (v=174 GeV) . As discussed in Ref. , when Γ ≫ ∆M ± ≡ M + − M − , the two singlet sneutrino states are not well-separated particles. In this case, the result for the asymmetry depends on how the initial state is prepared. In what follows we will assume that the sneutrinos are in a thermal bath with a thermalization time Γ −1 shorter than the typical oscillation times, ∆M −1 ± , therefore coherence is lost and it is appropriate to compute the CP asymmetry in terms of the mass eigenstates Eq. (3). In this regime the relevant BE (following in notation and details) including the dominant ∆L = 1, 2 decays and inverse decays as well as the ∆L = 1 scatterings with top quark are: sHz In the equations above, fermionic and scalar lepton asymmetries defined as The equilibrium abundances are given by Y eq c ≡ 15 s is the total number of entropic degrees of freedom, and g * s = 228.75 in the MSSM. In writing Eqs. (6)(7)(8), fast equilibration between the lepton asymmetry in scalars and fermions due to supersymmetry conserving processes has been accounted for. The different γ's are the thermal widths for the following processes: where in all cases a sum over the CP conjugate final states is implicit. The final amount of B−L asymmetry generated by the decay of the singlet sneutrino states assuming no pre-existing asymmetry can be parameterized as: whereǭ is given in Eq. (14) §. η is a dilution factor which takes into account the possible inefficiency in the production of the singlet sneutrinos, the erasure of the generated asymmetry by Lviolating scattering processes and the temperature dependence of the CP asymmetry and it is obtained by solving the array of BE above. After conversion by the sphaleron transitions, the final baryon asymmetry is related to the B − L asymmetry by Without including quantum effects in the BE, the relevant CP asymmetry in Eq. (8) is: where a k ≡ s k , f k with s k =l k h and f k = ℓ kh and we denote by γ the thermal averaged rates. Neglecting supersymmetry breaking in vertices: is the zero temperature CP asymmetry arising from the scalar (or minus the fermion) loop contribution to the N self-energy. 
The thermal factors are: where are the Bose-Einstein and Fermi-Dirac equilibrium distributions, respectively, and § The factor 2 in Eq. (10) arises from the fact that there are two right-handed sneutrino states while we have defined Y eq N for one degree of freedom. Defined this way, η has the standard normalization η → 1 for perfect out of equilibrium decay. We remind the reader that, as seen above, the relevant CP asymmetry in soft leptogenesis is T (i.e. time) dependent even in the classical regime. This is so because the CP asymmetry is generated by the supersymmetry breaking thermal effects which make the relevant decay CP asymmetries into scalars and fermions different. In the absence of these thermal corrections, no asymmetry is generated. The inclusion of quantum effects (for technical details see Refs. ) introduces an additional time dependence in the CP asymmetry: where Now, we simply have to change the variable from time t of Eq. (21) to a more convenient variable z as we do when writing down the BE as in Eqs. (6 -8). As a reminder to the readers, we will write a few lines illustrating this change. For a universe undergoing adiabatic expansion, the entropy per comoving volume is constant i.e. sR 3 = constant. Since s ∝ z −3 , we have R ∝ z. Then, the Hubble constant is given by H ≡ R −1 dR/dt = z −1 dz/dt. After integration, we get where z 0 is the temperature at t = 0, and H(M) ≡ H(z = 1) = 2/3 g * π 3 /5(M 2 /m pl ) with m pl being the Planck mass. Substituting Eq. (22) into Eq. (21), we obtain where we set z 0 = 0 (i.e. at very high initial temperature). In writing the second equality we have used that M + − M − = B (see Eq. (4)), and we have defined the degeneracy parameter R, and m * = 8v 2 u 3m pl g * π 5 5 ≃ 7.8 × 10 −4 eV. Thus the final CP asymmetry consists of three factors. The first one isǭ in Eq. (14) which is resonantly enhanced for R = 1. The second one is the thermal factor ∆ BF (T ) which is only non-vanishing for z > ∼ 0.8 . The third one is the quantum correction factor, QC(T ) which is composed of two oscillating functions. Next we turn to quantify the impact of this last additional quantum timedependence of the CP asymmetry on the final lepton asymmetry. Before doing so, let us notice that Eq. (8) corresponds to the one-flavour approximation. As discussed in Refs. the one-flavour approximation is rigorously correct only when the interactions mediated by charged lepton Yukawa couplings are out of equilibrium. This is not the case in soft leptogenesis since successful leptogenesis in this scenario requires a relatively low RH neutrino mass scale. Thus the characteristic T is such that the rates of processes mediated by the τ and µ Yukawa couplings are not negligible implying that the effects of lepton flavours have to be taken into account . However, as shown in Refs. quantum effects are flavour independent as long as the damping rates of the leptons are taken to be flavour independent. In this case the QC(T ) factor becomes the one given above (neglecting also the difference in the width between the two sneutrinos) which is the same for all flavours. Furthermore quantum flavour correlations can be safely neglected for soft leptogenesis because M < ∼ 10 −9 GeV and therefore there is no transition between three-to-two or two-toone flavour regimes. So following Ref. , it is straight forward to include flavour in the QBE given above in terms of flavour dependent wash-out factors. 
For the sake of simplicity in the presentation we restrict here to the one-flavour approximation for the study of the relevance of the quantum effects. Results We show in Fig. I the evolution of the lepton asymmetry with and without the quantum correction factor in the CP asymmetry for several values of the washout factor m ef f and for the resonant case R = 1 and the very degenerate case R = 2 × 10 −4 . The two upper panels correspond to strong and moderate washout regimes, while the lower two correspond to weak and very weak washout regimes. We consider two different initial conditions for the sneutrino abundance. In one case, one assumes that theÑ population is created by their Yukawa interactions with the thermal plasma, and set YÑ (z → 0) = 0. The other case corresponds to an initialÑ abundance equal to the thermal one, YÑ (z → 0) = Y eq N (z → 0). The initial condition on the sneutrino abundances can lead to differences in the weak washout regime. In the strong washout regime the asymmetry generated in the N production phase for YÑ (z → 0) = Y eq N (z → 0) is efficiently washed out (contrary to what happens in the weak washout regime). Consequently, in the strong washout regime the generated asymmetry is independent of the initial conditions. This behaviour is explicitly displayed on the upper panel of First we notice that, as expected, for strong washout and large degeneracy parameter R (see the upper curves in the upper panels), the quantum effects lead to the oscillation of the produced asymmetry till it finally averages out to the classical value. The figure also illustrates that for very small value of R and in the strong washout regime, quantum effects enhance the final asymmetry. For small enough R the arguments in the periodic functions in QC(T ) are very small for all relevant values of z and m ef f . So the sin 2 term in QC(T ) is negligible and expanding the sin term we get which, in the strong washout regime is always larger than 1. Also we see that, independently of the initial conditions, and of the value of the degeneracy parameter, R, the quantum effects always lead to a suppression of the final produced asymmetry in the weak washout regime. This is at difference of what happens in see-saw resonant leptogenesis in which quantum effects lead to an enhancement of the produced asymmetry in weak washout and R ∼ 1 and for zero initial sneutrino abundances . The origin of the difference is the additional time dependence of the asymmetry in soft leptogenesis due to ∆ BF . In order to understand this, we must remember that in see-saw resonant leptogenesis, in the weak washout regime, the final lepton asymmetry results from a cancellation between the anti-asymmetry generated when RH neutrinos are initially produced and the lepton asymmetry produced when they finally decay. When the time-dependent quantum corrections are included, this near-cancellation does not hold or it occurs at earlier times. As a consequence the asymmetry grows larger once these corrections are included as discussed in Ref. . But in soft-leptogenesis, even in the classical regime the thermal factor ∆ BF already prevents the cancellation to occur. Therefore the inclusion of the time dependent quantum effects only amounts to an additional multiplicative factor which, in this regime, is smaller than one. This behaviour is explicitly displayed in Fig. 
II where we compare the absolute value of the lepton asymmetry with the quantum time dependence of the CP asymmetry and without it in soft leptogenesis with what would be obtained if the thermal factor ∆ BF (z) was not included (so that the CP asymmetry takes a form similar to the one for resonant see-saw). As seen in Fig. II, without the ∆ BF (z) the asymmetry starts being produced at lower z and it changes sign in the classical regime. This change of sign is due to the cancellation between the anti-asymmetry generated when RH neutrinos are initially produced and the lepton asymmetry produced when they finally decay. Inclusion of We notice in passing that for standard see-saw resonant leptogenesis the weak washout regime is physically unreachable as long as flavour effects are not included. This is so because there is a lower bound on the washout parameter once the washout associated to the two quasi-degenerate heavy neutrinos contributes which implies thatm ≥ ∆m 2 solar ∼ 8 × 10 −3 . Such bound does not apply to soft leptogenesis as long as, as assumed in this work, only the lightest sneutrino generation contributes. the QC(T ) factor reduces the asymmetry at small z and this makes the cancellation to occur at lower z and consequently the final asymmetry is larger. In the full calculation (left panel in Fig. II) the asymmetry only starts being nonnegligible for larger z, i.e. z > ∼ 0.8, and it changes sign for z ∼ 1 , both features due to the ∆ BF factor. Inclusion of the quantum correction, QC(T ) amounts for a suppression of the initial asymmetry by a factor given in Eq. (27). As a consequence the final asymmetry is suppressed (and it also has the opposite sign) after including the quantum corrections. A more systematic dependence of the results with the washout and degeneracy parameters, m ef f and R is shown in Fig. III where we plot the efficiency factor η as a function of m ef f and R. We remind the reader that within our approximations for the thermal widths, in the classical regime, η is mostly a function of m ef f exclusively ¶. Inclusion of the quantum correction QC(T ) makes η to depend both on m ef f and R but still remains basically independent of M. From the figure we see that for small enough values of the product of the washout parameter and the degeneracy parameter the arguments of the periodic functions in QC(T ) are always small in the range of z where the lepton asymmetry is generated. As explained above, in this regime the sin 2 term in QC(T ) is negligible while the sin term is multiplied by an amplitude proportional to 1/R. Therefore, the dependence on R cancels ¶ There is a residual dependence on M due to the running of the top Yukawa coupling as well as the thermal effects included in ∆ BF although it is very mild. For tan β ∼ O(1) there is also an additional (very weak) dependence due to the associated change in the top Yukawa coupling. in this limit and the resulting correction is given in Eq. (27). This explains the plateaux observed at low values of the degeneracy parameter R in the lower panels of Fig. III. Similar behaviour is found in Ref. for the resonant leptogenesis scenario. Also, as seen in Eq. (27), the correction grows with m ef f which leads to the considerable enhancement of the efficiency seen in the upper curves of the lower panel in Fig. III. However we must notice that this enhancement occurs in a regime where the CP asymmetry is very small due to the small value of R sinceǭ is proportional to R. Finally, in Fig. 
IV we compare the range of parameters B and m ef f for which enough asymmetry is generated, Y B ≥ 8.54 × 10 −11 with and without inclusion of the quantum corrections. We show the ranges for several values of M and for the characteristic value of |ImA| = 1 TeV. From the figure we see that due to the suppression of the asymmetry for the weak washout regime discussed above, for a given value of M the regions extend only up to larger values of m ef f once the quantum corrections are included. Also, because of the enhancement in the very degenerate, strong washout regime, the regions tend to extend to lower values of B and larger values of m ef f for a given value of M. Furthermore, once quantum effects are included, η can take both signs (depending on the value of m ef f ), independently of the initialÑ abundance. Thus it is possible to generate the right sign asymmetry with either sign of ImA for both thermal and zero initialÑ abundance. On the contrary without quantum corrections, for thermal initial conditions η > 0 and the right asymmetry can only be generated for ImA > 0. Summary In this article we have performed a detailed study of the role of quantum effects in the soft leptogenesis scenario. We have studied the effects on the produced asymmetry as a function of the washout parameter m ef f and the degeneracy parameter R = 2∆M/Γ N Our results show that, because of the thermal nature of soft supersymmetry, the characteristic time for the building of the asymmetry is larger than in the see-saw resonant leptogenesis which leads to quantitative differences on the dependence of the effect on the washout regime between the two scenarios. In particular, in the weak washout regime, quantum effects do not enhance but suppress the produced lepton asymmetry in soft leptogenesis. Quantum effects are most quantitatively important for extremely degenerate sneutrinos ∆M ≪ Γ N . In this case and in the strong washout regime quantum effects can enhance the absolute value of the produced asymmetry as well as induce a change of its sign. But altogether, our results show that the required values of the Majorana mass M and the lepton violating soft bilinear coefficient B to achieve successful leptogenesis are not substantially modified. The upper (lower) panels correspond to vanishing (thermal) initialÑ abundance.
#include<stdio.h> int main() { int a; int i, n, num1 = 0, num2 = 0, num3 = 0, num4 = 0,sum=0; while (scanf("%d", &n) != EOF) { while (n--) { scanf("%d", &a); if (a == 1) num1 += 1; if (a == 2) num2 += 1; if (a == 3) num3 += 1; if (a == 4) num4 += 1; } sum = num4; if (num3 >= num1) sum += num3 + num2 / 2 + num2 % 2; if (num3 < num1) { if (num2 % 2 == 0 && (num1 - num3) % 4 == 0) sum += num3 + num2 / 2 + (num1 - num3) / 4; if (num2 % 2 == 0 && (num1 - num3) % 4 != 0) sum += num3 + num2 / 2 + (num1 - num3) / 4 + 1; if (num2 % 2 != 0 && (num1 - num3 - 2) >= 0 && (num1 - num3 - 2) % 4 != 0) sum += num3 + num2 / 2 + 1 + (num1 - num3 - 2) / 4 + 1; if (num2 % 2 != 0 && (num1 - num3 - 2) >= 0 && (num1 - num3 - 2) % 4 == 0) sum += num3 + num2 / 2 + 1 + (num1 - num3 - 2) / 4; if (num2 % 2 != 0 && (num1 - num3 - 2) < 0) sum += num3 + num2 / 2 + 1; } printf("%d", sum); num1 = 0; num2 = 0; num3 = 0; num4 = 0; } return 0; }
async def _auth_provider_from_config(hass, store, config): provider_name = config[CONF_TYPE] module = await load_auth_provider_module(hass, provider_name) if module is None: return None try: config = module.CONFIG_SCHEMA(config) except vol.Invalid as err: _LOGGER.error('Invalid configuration for auth provider %s: %s', provider_name, humanize_error(config, err)) return None return AUTH_PROVIDERS[provider_name](hass, store, config)
module Day12 ( day12 ,day12b ) where import Data.Maybe (fromJust) import Data.List (foldl',find,partition) import qualified Data.Set as Set day12 :: String -> Int day12 input = fromJust $ fmap length $ find (Set.member "0") $ findConnected $ parseTree input findConnected :: (Foldable t,Ord a) => t(a,[a]) -> [Set.Set a] findConnected theSet = foldl' connect [] theSet parseTree :: String -> [(String,[String])] parseTree input = map parseLine $ lines input parseLine :: String -> (String,[String]) parseLine input = (\list -> (head list, (tail $ tail list)) ) $ map trimTrailingComma $words input trimTrailingComma :: String -> String trimTrailingComma text = if (last text==',') then init text else text connect :: Ord a => [Set.Set a] -> (a, [a]) -> [Set.Set a] connect sets (key, children) = Set.unions (theSet : connected) : isolated where theSet = Set.fromList (key : children) (isolated, connected) = partition (Set.null . Set.intersection theSet) sets day12b :: String -> Int day12b input = length . foldl' connect [] $ parseTree input
/** * @author <a href="mailto:[email protected]">Bill Burke</a> * @version $Revision: 1 $ */ public class IDToken extends JsonWebToken { public static final String NONCE = "nonce"; public static final String AUTH_TIME = "auth_time"; public static final String SESSION_STATE = "session_state"; public static final String AT_HASH = "at_hash"; public static final String C_HASH = "c_hash"; public static final String NAME = "name"; public static final String GIVEN_NAME = "given_name"; public static final String FAMILY_NAME = "family_name"; public static final String MIDDLE_NAME = "middle_name"; public static final String NICKNAME = "nickname"; public static final String PREFERRED_USERNAME = "preferred_username"; public static final String PROFILE = "profile"; public static final String PICTURE = "picture"; public static final String WEBSITE = "website"; public static final String EMAIL = "email"; public static final String EMAIL_VERIFIED = "email_verified"; public static final String GENDER = "gender"; public static final String BIRTHDATE = "birthdate"; public static final String ZONEINFO = "zoneinfo"; public static final String LOCALE = "locale"; public static final String PHONE_NUMBER = "phone_number"; public static final String PHONE_NUMBER_VERIFIED = "phone_number_verified"; public static final String ADDRESS = "address"; public static final String UPDATED_AT = "updated_at"; public static final String CLAIMS_LOCALES = "claims_locales"; public static final String ACR = "acr"; // Financial API - Part 2: Read and Write API Security Profile // http://openid.net/specs/openid-financial-api-part-2.html#authorization-server public static final String S_HASH = "s_hash"; // NOTE!!! WE used to use @JsonUnwrapped on a UserClaimSet object. This screws up otherClaims and the won't work // anymore. So don't have any @JsonUnwrapped! 
@JsonProperty(NONCE) protected String nonce; @JsonProperty(AUTH_TIME) protected int authTime; @JsonProperty(SESSION_STATE) protected String sessionState; @JsonProperty(AT_HASH) protected String accessTokenHash; @JsonProperty(C_HASH) protected String codeHash; @JsonProperty(NAME) protected String name; @JsonProperty(GIVEN_NAME) protected String givenName; @JsonProperty(FAMILY_NAME) protected String familyName; @JsonProperty(MIDDLE_NAME) protected String middleName; @JsonProperty(NICKNAME) protected String nickName; @JsonProperty(PREFERRED_USERNAME) protected String preferredUsername; @JsonProperty(PROFILE) protected String profile; @JsonProperty(PICTURE) protected String picture; @JsonProperty(WEBSITE) protected String website; @JsonProperty(EMAIL) protected String email; @JsonProperty(EMAIL_VERIFIED) protected Boolean emailVerified; @JsonProperty(GENDER) protected String gender; @JsonProperty(BIRTHDATE) protected String birthdate; @JsonProperty(ZONEINFO) protected String zoneinfo; @JsonProperty(LOCALE) protected String locale; @JsonProperty(PHONE_NUMBER) protected String phoneNumber; @JsonProperty(PHONE_NUMBER_VERIFIED) protected Boolean phoneNumberVerified; @JsonProperty(ADDRESS) protected AddressClaimSet address; @JsonProperty(UPDATED_AT) protected Long updatedAt; @JsonProperty(CLAIMS_LOCALES) protected String claimsLocales; @JsonProperty(ACR) protected String acr; // Financial API - Part 2: Read and Write API Security Profile // http://openid.net/specs/openid-financial-api-part-2.html#authorization-server @JsonProperty(S_HASH) protected String stateHash; public String getNonce() { return nonce; } public void setNonce(String nonce) { this.nonce = nonce; } public int getAuthTime() { return authTime; } public void setAuthTime(int authTime) { this.authTime = authTime; } public String getSessionState() { return sessionState; } public void setSessionState(String sessionState) { this.sessionState = sessionState; } public String getAccessTokenHash() { return accessTokenHash; } public void setAccessTokenHash(String accessTokenHash) { this.accessTokenHash = accessTokenHash; } public String getCodeHash() { return codeHash; } public void setCodeHash(String codeHash) { this.codeHash = codeHash; } public String getName() { return this.name; } public void setName(String name) { this.name = name; } public String getGivenName() { return this.givenName; } public void setGivenName(String givenName) { this.givenName = givenName; } public String getFamilyName() { return this.familyName; } public void setFamilyName(String familyName) { this.familyName = familyName; } public String getMiddleName() { return this.middleName; } public void setMiddleName(String middleName) { this.middleName = middleName; } public String getNickName() { return this.nickName; } public void setNickName(String nickName) { this.nickName = nickName; } public String getPreferredUsername() { return this.preferredUsername; } public void setPreferredUsername(String preferredUsername) { this.preferredUsername = preferredUsername; } public String getProfile() { return this.profile; } public void setProfile(String profile) { this.profile = profile; } public String getPicture() { return this.picture; } public void setPicture(String picture) { this.picture = picture; } public String getWebsite() { return this.website; } public void setWebsite(String website) { this.website = website; } public String getEmail() { return this.email; } public void setEmail(String email) { this.email = email; } public Boolean getEmailVerified() { return 
this.emailVerified; } public void setEmailVerified(Boolean emailVerified) { this.emailVerified = emailVerified; } public String getGender() { return this.gender; } public void setGender(String gender) { this.gender = gender; } public String getBirthdate() { return this.birthdate; } public void setBirthdate(String birthdate) { this.birthdate = birthdate; } public String getZoneinfo() { return this.zoneinfo; } public void setZoneinfo(String zoneinfo) { this.zoneinfo = zoneinfo; } public String getLocale() { return this.locale; } public void setLocale(String locale) { this.locale = locale; } public String getPhoneNumber() { return this.phoneNumber; } public void setPhoneNumber(String phoneNumber) { this.phoneNumber = phoneNumber; } public Boolean getPhoneNumberVerified() { return this.phoneNumberVerified; } public void setPhoneNumberVerified(Boolean phoneNumberVerified) { this.phoneNumberVerified = phoneNumberVerified; } public AddressClaimSet getAddress() { return address; } public void setAddress(AddressClaimSet address) { this.address = address; } public Long getUpdatedAt() { return this.updatedAt; } public void setUpdatedAt(Long updatedAt) { this.updatedAt = updatedAt; } public String getClaimsLocales() { return this.claimsLocales; } public void setClaimsLocales(String claimsLocales) { this.claimsLocales = claimsLocales; } public String getAcr() { return acr; } public void setAcr(String acr) { this.acr = acr; } // Financial API - Part 2: Read and Write API Security Profile // http://openid.net/specs/openid-financial-api-part-2.html#authorization-server public String getStateHash() { return stateHash; } public void setStateHash(String stateHash) { this.stateHash = stateHash; } }
import { NgModule } from '@angular/core'; import { RouterModule, Routes, PreloadAllModules } from '@angular/router'; import { CanDeactivate } from '@angular/router'; import { RouletteComponent } from './roulette.component'; import { DepositComponent } from './deposit.component'; import { WithdrawComponent } from './withdraw.component'; import { LoginComponent } from './login.component'; import { ProvablyFairComponent } from './provably-fair.component'; const routes: Routes = [ { path: 'deposit', component: DepositComponent }, { path: 'withdraw', component: WithdrawComponent}, { path: 'login/:id', component: LoginComponent }, { path: 'provably-fair', component: ProvablyFairComponent}, { path: '**', component: RouletteComponent} ]; @NgModule({ imports: [ RouterModule.forRoot(routes, { preloadingStrategy: PreloadAllModules}) ], exports: [ RouterModule ] }) export class AppRoutingModule {}
import math nums = list(map(int, input().split(" "))) a = nums[0] b = nums[1] x = nums[2] bottle = a * a * b if bottle / 2.0 < x: temp = bottle - x y = temp * 2 / a / a result = math.degrees(math.atan(y/a)) else: y = x * 2 / b / a result = math.degrees(math.atan(b/y)) print(result)
/** * * @author Hai * class dinh nghia cac dang table co trong phan mem */ public class ClassTableModel { // bang cho main frame public DefaultTableModel setTableNhanKhau(List<NhanKhauModel> listItem, String[] listColumn) { final int columns = listColumn.length; DefaultTableModel dtm = new DefaultTableModel() { @Override public boolean isCellEditable(int row, int column) { return super.isCellEditable(row, column); //To change body of generated methods, choose Tools | Templates. } @Override public Class<?> getColumnClass(int columnIndex) { return columnIndex == 5 ? Boolean.class : String.class; } }; dtm.setColumnIdentifiers(listColumn); Object[] obj; obj = new Object[columns]; listItem.forEach((NhanKhauModel item) -> { obj[0] = item.getID(); obj[1] = item.getHoTen(); obj[2] = item.getNamSinh(); obj[3] = item.getGioiTinh(); obj[4] = item.getDiaChiHienNay(); dtm.addRow(obj); }); return dtm; } // table cho tieusu public DefaultTableModel setTableTieuSu(List<TieuSuModel> tieuSu, String[] listColumn) { final int column = listColumn.length; DefaultTableModel dtm = new DefaultTableModel() { @Override public boolean isCellEditable(int row, int column) { return super.isCellEditable(row, column); //To change body of generated methods, choose Tools | Templates. } @Override public Class<?> getColumnClass(int columnIndex) { return columnIndex == 5 ? Boolean.class : String.class; } }; dtm.setColumnIdentifiers(listColumn); Object[] obj; obj = new Object[column]; tieuSu.forEach((TieuSuModel item) -> { obj[0] = item.getTuNgay().toString(); obj[1] = item.getDenNgay().toString(); obj[2] = item.getDiaChi(); obj[3] = item.getNgheNghiep(); obj[4] = item.getNoiLamViec(); dtm.addRow(obj); }); dtm.addRow(new Object[] {"", "", "", "", ""}); // dtm.addTableModelListener(new TableModelListener() { // @Override // public void tableChanged(TableModelEvent e) { // int a = dtm.getRowCount(); // if ((e.getLastRow() + 1) == dtm.getRowCount()) { // System.out.println(); // } // // } // }); return dtm; } public DefaultTableModel setTableGiaDinh(List<GiaDinhModel> giaDinh, String[] listColumn) { final int column = listColumn.length; DefaultTableModel dtm = new DefaultTableModel() { @Override public boolean isCellEditable(int row, int column) { return super.isCellEditable(row, column); //To change body of generated methods, choose Tools | Templates. } @Override public Class<?> getColumnClass(int columnIndex) { return columnIndex == 6 ? Boolean.class : String.class; } }; dtm.setColumnIdentifiers(listColumn); Object[] obj; obj = new Object[column]; giaDinh.forEach((GiaDinhModel item) -> { obj[0] = item.getHoTen(); obj[1] = item.getNamSinh().toString(); obj[2] = item.getGioiTinh(); obj[3] = item.getQuanHeVoiNhanKhau(); obj[4] = item.getNgheNghiep(); obj[5] = item.getDiaChiHienTai(); dtm.addRow(obj); }); dtm.addRow(new Object[] {"", "", "", "", "", ""}); return dtm; } public DefaultTableModel setTableHoKhau(List<HoKhauBean> listItem, String[] listColumn) { final int columns = listColumn.length; DefaultTableModel dtm = new DefaultTableModel() { @Override public boolean isCellEditable(int row, int column) { return super.isCellEditable(row, column); //To change body of generated methods, choose Tools | Templates. } @Override public Class<?> getColumnClass(int columnIndex) { return columnIndex == 5 ? 
Boolean.class : String.class; } }; dtm.setColumnIdentifiers(listColumn); Object[] obj; obj = new Object[columns]; listItem.forEach((HoKhauBean item) -> { obj[0] = item.getHoKhauModel().getID(); obj[1] = item.getHoKhauModel().getMaHoKhau(); obj[2] = item.getChuHo().getHoTen(); obj[3] = item.getHoKhauModel().getDiaChi(); obj[4] = item.getHoKhauModel().getNgayLap(); dtm.addRow(obj); }); return dtm; } }
<filename>Application/src/main/java/com/example/android/bluetoothlegatt/UserListAdapter.java<gh_stars>0 package com.example.android.bluetoothlegatt; import android.content.Context; import android.support.v7.widget.RecyclerView; import android.view.LayoutInflater; import android.view.View; import android.view.ViewGroup; import android.widget.TextView; import java.util.List; public class UserListAdapter extends RecyclerView.Adapter<UserListAdapter.UserViewHolder> { private final LayoutInflater mInflater; private List<User> mUsers; // Cached copy of users UserListAdapter(Context context) { mInflater = LayoutInflater.from(context); } @Override public UserViewHolder onCreateViewHolder(ViewGroup parent, int viewType) { View itemView = mInflater.inflate(R.layout.recyclerview_item, parent, false); return new UserViewHolder(itemView); } @Override public void onBindViewHolder(UserViewHolder holder, int position) { // Covers the case of data not being ready yet. if (mUsers != null) { User current = mUsers.get(position); holder.userItemView.setText(current.getUser()); } else holder.userItemView.setText(R.string.no_user); } void setUsers(List<User> users){ mUsers = users; notifyDataSetChanged(); } // getItemCount() is called many times, and when it is first called, // mUsers has not been updated (means initially, it's null, and we can't return null). @Override public int getItemCount() { if (mUsers != null) return mUsers.size(); else return 0; } public User getUserAtPosition (int position) { return mUsers.get(position); } class UserViewHolder extends RecyclerView.ViewHolder { private final TextView userItemView; private UserViewHolder(View itemView) { super(itemView); userItemView = itemView.findViewById(R.id.textView); } } }
#include "trader/kdbp_db.h" #include <iostream> #include <thread> #include <regex> #include <string> #include <chrono> using namespace std; I isRemoteErr(K x) { if (!x) { fprintf(stderr, "Network error: %s\n", strerror(errno)); return 1; } else if (-128 == xt) { fprintf(stderr, "Error message returned : %s\\n", x->s); r0(x); return 1; } return 0; } J Kdbp::castTime(struct tm *x) { return (J)((60 * x->tm_hour + x->tm_min) * 60 + x->tm_sec) * 1000000000; } int Kdbp::insertMultRow(const string &query, const string &table, K rows) { K result; //sync result = k(-this->_kdb, S(query.c_str()), ks((S)table.c_str()), rows, (K)0); //async // result = k(-this->_kdb, query, (K)0); if (isRemoteErr(result)) { return 1; } // result->n - length; return 0; } int Kdbp::insertRow(const string &query, const string &table, K row) { K result; //sync result = k(-this->_kdb, (S)query.c_str(), ks((S)table.c_str()), row, (K)0); //async // result = k(-this->_kdb, query, (K)0); if (isRemoteErr(result)) { return 1; } // result->n - length; return 0; } int Kdbp::deleteRow(const string &query, const string &table, K row) { K result; //sync result = k(-this->_kdb, (S)query.c_str(), ks((S)table.c_str()), row, (K)0); //async // result = k(-this->_kdb, query, (K)0); if (isRemoteErr(result)) { return 1; } // result->n - length; return 0; } int Kdbp::executeQuery(const string &query) { K result; //sync result = k(-this->_kdb, (S)query.c_str(), (K)0); //async // result = k(-this->_kdb, query, (K)0); if (isRemoteErr(result)) { return 1; } // result->n - length; return 0; } K Kdbp::readQuery(const string &query) { K result; //sync result = k(this->_kdb, (S)query.c_str(), (K)0); //async // result = k(-this->_kdb, query, (K)0); if (isRemoteErr(result)) { return (K)0; } // result->n - length; return result; } void Kdbp::fmt_time(char *str, time_t time, int adjusted) { static char buffer[4096]; struct tm *timeinfo = localtime(&time); if (adjusted) timeinfo->tm_hour -= 1; strftime(buffer, sizeof(buffer), str, timeinfo); printf("%s ", buffer); } K Kdbp::printatom(K x) { char *p = new char[100]; switch (xt) { case -1: printf("%db", x->g); break; case -4: printf("0x%02x", x->g); break; case -5: printf("%d", x->h); break; case -6: printf("%d", x->i); break; case -7: printf("%lld", x->j); break; case -8: printf("%.2f", x->e); break; case -9: printf("%.2f", x->f); break; case -10: printf("\"%c\"", x->g); break; case -11: printf("`%s", x->s); break; case -12: strcpy(p, "%Y.%m.%dD%H:%M:%S."); fmt_time(p, ((x->j) / 8.64e13 + 10957) * 8.64e4, 0); break; case -13: printf("%04d.%02d", (x->i) / 12 + 2000, (x->i) % 12 + 1); break; case -14: strcpy(p, "%Y.%m.%d"); fmt_time(p, ((x->i) + 10957) * 8.64e4, 0); break; case -15: strcpy(p, "%Y.%m.%dD%H:%M:%S"); fmt_time(p, ((x->f) + 10957) * 8.64e4, 0); break; case -16: { strcpy(p, "%jD%H:%M:%S"); fmt_time(p, (x->j) / 1000000000, 1); printf(".%09lld", (x->j) % 1000000000); break; } case -17: strcpy(p, "%H:%M"); fmt_time(p, (x->i) * 60, 1); break; case -18: strcpy(p, "%H:%M:%S"); fmt_time(p, x->i, 1); break; case -19: { strcpy(p, "%H:%M:%S"); fmt_time(p, (x->i) / 1000, 1); printf(".%03d", (x->i) % 1000); break; } default: strcpy(p, "notimplemented"); return krr(p); } return (K)0; } #define showatom(c, a, x, i) \ do \ { \ K r = c(a((x))[i]); \ printatom(r); \ r0(r); \ } while (0) K Kdbp::getitem(K x, int index) { char *p = new char[100]; K r; switch (xt) { case 0: r = kK(x)[index]; break; case 1: r = kb(kG((x))[index]); break; case 4: r = kg(kG((x))[index]); break; case 5: r = kh(kH((x))[index]); break; 
case 6: r = ki(kI((x))[index]); break; case 7: r = kj(kJ((x))[index]); break; case 8: r = ke(kE((x))[index]); break; case 9: r = kf(kF((x))[index]); break; case 10: r = kc(kC((x))[index]); break; case 11: r = ks(kS((x))[index]); break; case 14: r = kd(kI((x))[index]); break; case 15: r = kz(kF((x))[index]); break; default: strcpy(p, "notimplemented"); r = krr(p); } return r; } K Kdbp::printitem(K x, int index) { char *p = new char[100]; switch (xt) { case 0: printq(kK(x)[index]); break; case 1: showatom(kb, kG, x, index); break; case 4: showatom(kg, kG, x, index); break; case 5: showatom(kh, kH, x, index); break; case 6: showatom(ki, kI, x, index); break; case 7: showatom(kj, kJ, x, index); break; case 8: showatom(ke, kE, x, index); break; case 9: showatom(kf, kF, x, index); break; case 10: showatom(kc, kC, x, index); break; case 11: showatom(ks, kS, x, index); break; case 14: showatom(kd, kI, x, index); break; case 15: showatom(kz, kF, x, index); break; default: strcpy(p, "notimplemented"); return krr(p); } return (K)0; } K Kdbp::printlist(K x) { if (x->n == 1) printf(","); for (int i = 0; i < x->n; i++) printitem(x, i); return (K)0; } K Kdbp::printdict(K x) { K keys = kK(x)[0]; K data = kK(x)[1]; for (int row = 0; row < keys->n; row++) { printitem(keys, row); printf("| "); printitem(data, row); if (row < keys->n - 1) printf("\n"); } return (K)0; } K Kdbp::printtable(K x) { K flip = ktd(x); K columns = kK(flip->k)[0]; K rows = kK(flip->k)[1]; int colcount = columns->n; int rowcount = kK(rows)[0]->n; for (int i = 0; i < colcount; i++) printf("%s\t", kS(columns)[i]); printf("\n"); for (int i = 0; i < rowcount; i++) { for (int j = 0; j < colcount; j++) { printitem(kK(rows)[j], i); printf("\t"); } printf("\n"); } return (K)0; } K Kdbp::printq(K x) { K result; char *p = new char[100]; strcpy(p, "notimplemented"); if (xt < 0) result = printatom(x); else if ((xt >= 0) && (xt < 20)) result = printlist(x); else if (xt == 98) result = printtable(x); else if (xt == 99) result = printdict(x); else result = krr(p); printf("\n"); return result; }
from sys import exit # import copy # import numpy as np # from collections import deque N, D = map(int, input().split()) A = [list(map(int, input().split())) for _ in range(N)] cnt = 0 for a in A: if a[0] ** 2 + a[1] ** 2 <= D ** 2: cnt += 1 print(cnt)
// kill terminates a single process by id, sending either TERM or KILL. func kill() cli.Command { const ( idFlagName = "id" killFlagName = "kill" ) return cli.Command{ Name: "kill", Usage: "Terminate a process.", Flags: append(clientFlags(), cli.StringFlag{ Name: joinFlagNames(idFlagName, "i"), Usage: "Specify the ID of the process to kill.", }, cli.BoolFlag{ Name: killFlagName, Usage: "Send SIGKILL (9) rather than SIGTERM (15).", }, ), Before: mergeBeforeFuncs( clientBefore(), func(c *cli.Context) error { if len(c.String(idFlagName)) == 0 { if c.NArg() != 1 { return errors.New("must specify a process ID") } return errors.Wrap(c.Set(idFlagName, c.Args().First()), "problem setting id from positional flags") } return nil }), Action: func(c *cli.Context) error { ctx, cancel := context.WithCancel(context.Background()) defer cancel() sendKill := c.Bool(killFlagName) procID := c.String(idFlagName) return withConnection(ctx, c, func(client remote.Manager) error { proc, err := client.Get(ctx, procID) if err != nil { return errors.WithStack(err) } if sendKill { return errors.WithStack(jasper.Kill(ctx, proc)) } return errors.WithStack(jasper.Terminate(ctx, proc)) }) }, } }
import React, { PropsWithChildren, useCallback, useEffect, useMemo, useState } from "react" import { Toolbar } from "./components/toolbar" import { FooterComponent } from "./components/footer-component" import { Faqpage } from "./pages/faq-page" import "./custom.scss" import { HashRouter as Router, Switch, Route, Redirect } from "react-router-dom" import { ProjectListPage } from "./pages/project-list-page" import ProjectPage from "./pages/project-page" import { useCheckAutenticationLazyQuery, useCheckAutenticationQuery } from "./api" import { Credentials, LoginPage } from "./pages/login-page" import { ApolloClient, ApolloProvider, InMemoryCache } from "@apollo/client" import { LoadingComponent } from "./components/loading-component" import { createUploadLink } from "apollo-upload-client" import { toast, ToastContainer } from "react-toastify" import "react-toastify/dist/ReactToastify.css" import { InfoComponent } from "./components/info-component" function saveCredentials({ username, key, uri }: Credentials, rememberMe: boolean) { if (rememberMe) { localStorage.setItem("username", username) localStorage.setItem("key", key) localStorage.setItem("uri", uri) } else { sessionStorage.setItem("username", username) sessionStorage.setItem("key", key) sessionStorage.setItem("uri", uri) } } function deleteCredentials(): void { localStorage.removeItem("username") localStorage.removeItem("key") localStorage.removeItem("uri") sessionStorage.removeItem("username") sessionStorage.removeItem("key") sessionStorage.removeItem("uri") } function getCredentials(): Credentials | undefined { const username = sessionStorage.getItem("username") ?? localStorage.getItem("username") const key = sessionStorage.getItem("key") ?? localStorage.getItem("key") const uri = sessionStorage.getItem("uri") ?? localStorage.getItem("uri") if (key != null && username != null && uri != null) { return { username, key, uri, } } else { return undefined } } export function ClientApp() { const [credentials, setCredentials] = useState(() => getCredentials()) const client = useMemo( () => credentials == null ? undefined : new ApolloClient({ cache: new InMemoryCache(), link: createUploadLink({ uri: credentials.uri, credentials: "same-origin", headers: { authorization: JSON.stringify({ username: credentials.username, key: credentials.key, }), }, }), }), [credentials] ) if (client == null) { return <App tryLogin={false} setCredentials={setCredentials} /> } return ( <ApolloProvider client={client}> <App tryLogin={credentials != null} setCredentials={setCredentials} /> </ApolloProvider> ) } export function App({ setCredentials, tryLogin, }: { tryLogin: boolean setCredentials: (credentials: Credentials | undefined) => void }) { const logout = useCallback(() => { deleteCredentials() setCredentials(undefined) toast.success("successfully logged out") }, [setCredentials]) return ( <div className="overflow-hidden"> <ToastContainer /> <div className="w-100 bg-secondary p-2 justify-content-end d-flex px-3 align-items-center"> <span className="text-light">Alpha Version</span> <InfoComponent placement="left" color="white" text="The AdapterHub Playground is a research project and not intended for usage as a free web hosting service to run your online business, e-commerce site, or any other website that is primarily directed at either facilitating commercial transactions or providing commercial software as a service (SaaS)." 
/> </div> <Router> <Switch> <Route path={"/projects/:id"} render={(props) => ( <AuthWrapper tryLogin={tryLogin} setCredentials={setCredentials}> <Toolbar logout={logout} /> <ProjectPage {...props} /> </AuthWrapper> )}></Route> <Route path="/projects"> <AuthWrapper tryLogin={tryLogin} setCredentials={setCredentials}> <Toolbar logout={logout} /> <ProjectListPage /> </AuthWrapper> </Route> <Route path="/faq"> <Toolbar /> <Faqpage /> </Route> <Route path=""> <Redirect to="/projects" /> </Route> </Switch> </Router> <FooterComponent /> </div> ) } export function AuthWrapper({ children, setCredentials, tryLogin, }: PropsWithChildren<{ setCredentials: (credentials: Credentials | undefined) => void tryLogin: boolean }>) { const [checkAuthentication, { data, loading }] = useCheckAutenticationLazyQuery({ onError: (error) => { toast.error(`unable to login (${error.message})`) }, onCompleted: (data) => { if (data.checkAuthentication) { toast.success("successfully logged in") } else { toast.error("unable to login (wrong/expired token)") } }, }) useEffect(() => { if (tryLogin) { checkAuthentication() } }, [tryLogin]) const login = useCallback( (credentials: Credentials, rememberMe: boolean) => { saveCredentials(credentials, rememberMe) setCredentials(credentials) }, [setCredentials] ) if (loading) { return <LoadingComponent>Logging In</LoadingComponent> } else if (data?.checkAuthentication) { return <div>{children}</div> } else { return ( <div> <Toolbar /> <LoginPage login={login} /> </div> ) } }
pub use crate::wasmer_inner::wasmer_engines::{Dylib, Universal}; // Deprecated engines. pub use crate::wasmer_inner::wasmer_engines::{Native, JIT};
/* Reset dynamic em_rx_queue fields back to defaults */ static void em_reset_rx_queue(struct em_rx_queue *rxq) { rxq->rx_tail = 0; rxq->nb_rx_hold = 0; rxq->pkt_first_seg = NULL; rxq->pkt_last_seg = NULL; }
<filename>profiler/src/main/java/com/splunk/opentelemetry/profiler/events/RelevantEvents.java<gh_stars>0 /* * Copyright Splunk Inc. * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ package com.splunk.opentelemetry.profiler.events; import com.splunk.opentelemetry.profiler.Configuration; import com.splunk.opentelemetry.profiler.TLABProcessor; import com.splunk.opentelemetry.profiler.ThreadDumpProcessor; import io.opentelemetry.instrumentation.api.config.Config; import java.util.Arrays; import java.util.HashSet; import java.util.Set; import jdk.jfr.consumer.RecordedEvent; public class RelevantEvents { private final Set<String> eventNames; private RelevantEvents(Set<String> eventNames) { this.eventNames = eventNames; } public static RelevantEvents create(Config config) { Set<String> eventNames = new HashSet<>(Arrays.asList(ThreadDumpProcessor.EVENT_NAME, ContextAttached.EVENT_NAME)); if (Configuration.getTLABEnabled(config)) { eventNames.add(TLABProcessor.NEW_TLAB_EVENT_NAME); eventNames.add(TLABProcessor.OUTSIDE_TLAB_EVENT_NAME); } return new RelevantEvents(eventNames); } public boolean isRelevant(RecordedEvent event) { return eventNames.contains(event.getEventType().getName()); } }
"""The set of constants in the game. This includes not just ints but also classes like Item, GameType, Action, etc. """ from enum import Enum RENDER_FPS = 15 BOARD_SIZE = 11 NUM_RIGID = 36 NUM_WOOD = 36 NUM_ITEMS = 20 AGENT_VIEW_SIZE = 4 HUMAN_FACTOR = 32 DEFAULT_BLAST_STRENGTH = 2 DEFAULT_BOMB_LIFE = 10 # color for each of the 4 agents AGENT_COLORS = [[231, 76, 60], [46, 139, 87], [65, 105, 225], [238, 130, 238]] # color for each of the items. ITEM_COLORS = [[240, 248, 255], [128, 128, 128], [210, 180, 140], [255, 153, 51], [241, 196, 15], [141, 137, 124]] ITEM_COLORS += [(153, 153, 255), (153, 204, 204), (97, 169, 169), (48, 117, 117)] # If using collapsing boards, the step at which the board starts to collapse. FIRST_COLLAPSE = 500 MAX_STEPS = 800 RADIO_VOCAB_SIZE = 8 RADIO_NUM_WORDS = 2 # Files for images and and fonts RESOURCE_DIR = 'resources/' file_names = [ 'Passage', 'Rigid', 'Wood', 'Bomb', 'Flames', 'Fog', 'ExtraBomb', 'IncrRange', 'Kick', 'AgentDummy', 'Agent0', 'Agent1', 'Agent2', 'Agent3', 'AgentDummy-No-Background', 'Agent0-No-Background', 'Agent1-No-Background', 'Agent2-No-Background', 'Agent3-No-Background', 'X-No-Background' ] IMAGES_DICT = { num: { 'id': num, 'file_name': '%s.png' % file_name, 'name': file_name, 'image': None } for num, file_name in enumerate(file_names) } FONTS_FILE_NAMES = ['Cousine-Regular.ttf'] # Human view board configurations BORDER_SIZE = 20 MARGIN_SIZE = 10 TILE_SIZE = 50 BACKGROUND_COLOR = (41, 39, 51, 255) TILE_COLOR = (248, 221, 82, 255) TEXT_COLOR = (170, 170, 170, 255) # Constants for easier setting of the "Easy" game. BOARD_SIZE_EASY = 11 NUM_RIGID_EASY = 36 NUM_WOOD_EASY = 36 DEFAULT_BOMB_LIFE_EASY = 10 MAX_STEPS_EASY = 800 # NOTE: Should we get rid of can_kick? That's a hard one to use as well... NUM_ITEMS_EASY = int(NUM_WOOD_EASY/2) DEFAULT_BLAST_STRENGTH_EASY = 2 # Constants for easier setting of the "8x8" game. BOARD_SIZE_8 = 8 NUM_RIGID_8 = 20 NUM_WOOD_8 = 12 DEFAULT_BOMB_LIFE_8 = 7 MAX_STEPS_8 = 500 NUM_ITEMS_8 = 12 DEFAULT_BLAST_STRENGTH_8 = 2 # Constants for the Grid with single agent and goal. GRID_BOARD_SIZE = 24 GRID_MAX_STEPS = 200 GRID_NUM_RIGID = 0 # no walls # num --> avg make_board_grid / avg num_inaccessible # for min_path = 30: # 220 --> 1.73 / 5.72, 180 --> 2.36 / 3.42, 160 --> 3.055 / 3.93, # 120 --> 5.55 / 6.55, 150 --> 3.1 / 3.9 # for min_path = 25 # 150 --> 6.6 / 7.2, 180 --> 3.2 / 4.1 GRIDWALLS_NUM_RIGID = 120 # was 180 # some rigid walls TREE_SIZE = 8 TREE_MAX_STEPS = 100 class Item(Enum): """The Items in the game. When picked up: - ExtraBomb increments the agent's ammo by 1. - IncrRange increments the agent's blast strength by 1. - Kick grants the agent the ability to kick items. AgentDummy is used by team games to denote the third enemy and by ffa to denote the teammate. """ Passage = 0 Rigid = 1 Wood = 2 Bomb = 3 Flames = 4 Fog = 5 ExtraBomb = 6 IncrRange = 7 Kick = 8 AgentDummy = 9 Agent0 = 10 Agent1 = 11 Agent2 = 12 Agent3 = 13 Goal = 14 class GridItem(Enum): """The Items for the Grid env.""" Passage = 0 Wall = 1 Goal = 2 Agent = 3 class GameType(Enum): """The Game Types. FFA: 1v1v1v1. Submit an agent; it competes against other submitted agents. Team: 2v2. Submit an agent; it is matched up randomly with another agent and together take on two other similarly matched agents. TeamRadio: 2v2. Submit two agents; they are matched up against two other agents. Each team passes discrete communications to each other. 
""" FFA = 1 Team = 2 TeamRadio = 3 Grid = 4 Tree = 5 class Action(Enum): Stop = 0 Up = 1 Down = 2 Left = 3 Right = 4 Bomb = 5 class Result(Enum): Win = 0 Loss = 1 Tie = 2 Incomplete = 3 class InvalidAction(Exception): pass
A western Arkansas lawmaker said Tuesday he has left the Republican Party to become an independent. “I believe I can best represent my district and my values as an independent,” Rep. Nate Bell of Mena said in a text message Tuesday morning to Talk Business and Politics. Bell, who is not running for re-election in 2016, said he had no other comments about the party switch, which leaves Republicans with a 63-36-1 margin in the 100-member state House. Republicans also control the state Senate by a 24-11 margin over Democrats. Bell, a poultry and cattle farmer who works with construction consulting and in the bail bond business according to his House biography, was elected in 2010 in District 20. The district covers parts of Montgomery, Polk and Sevier counties. He also serves as chairman of the House State Agencies and Governmental Affairs committee. The committee was busy in the regular session, which wrapped up in April as well as the special session that ended Thursday (May 28). During the special session, Bell opposed a bill that would have changed the state’s primary date from May to March 2016. Bell cited a variety of reasons for his opposition, including increased costs for the primary and that the issue was not thought out. A Senate version of the bill was approved in the legislature and signed into law Friday (May 29) by Gov. Asa Hutchinson. In the regular session, Bell sponsored House Bill 1113. The bill would have sought to eliminate the dual status of a state holiday honoring Dr. Martin Luther King Jr. and Confederate General Robert E. Lee. The bill failed in committee after several attempts. Comments comments
<filename>API-Samples/07-init_uniform_buffer/07-init_uniform_buffer.cpp<gh_stars>1000+ /* * Vulkan Samples * * Copyright (C) 2015-2016 Valve Corporation * Copyright (C) 2015-2016 LunarG, Inc. * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ /* VULKAN_SAMPLE_SHORT_DESCRIPTION Create Uniform Buffer */ /* This is part of the draw cube progression */ #include <util_init.hpp> #include <assert.h> #include <string.h> #include <cstdlib> int sample_main(int argc, char *argv[]) { VkResult U_ASSERT_ONLY res; bool U_ASSERT_ONLY pass; struct sample_info info = {}; char sample_title[] = "Uniform Buffer Sample"; init_global_layer_properties(info); init_instance(info, sample_title); init_enumerate_device(info); init_queue_family_index(info); init_device(info); init_window_size(info, 64, 64); info.Projection = glm::perspective(glm::radians(45.0f), 1.0f, 0.1f, 100.0f); info.View = glm::lookAt(glm::vec3(-5, 3, -10), // Camera is at (-5,3,-10), in World Space glm::vec3(0, 0, 0), // and looks at the origin glm::vec3(0, -1, 0) // Head is up (set to 0,-1,0 to look upside-down) ); info.Model = glm::mat4(1.0f); // Vulkan clip space has inverted Y and half Z. // clang-format off info.Clip = glm::mat4(1.0f, 0.0f, 0.0f, 0.0f, 0.0f,-1.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.5f, 0.0f, 0.0f, 0.0f, 0.5f, 1.0f); // clang-format on info.MVP = info.Clip * info.Projection * info.View * info.Model; /* VULKAN_KEY_START */ VkBufferCreateInfo buf_info = {}; buf_info.sType = VK_STRUCTURE_TYPE_BUFFER_CREATE_INFO; buf_info.pNext = NULL; buf_info.usage = VK_BUFFER_USAGE_UNIFORM_BUFFER_BIT; buf_info.size = sizeof(info.MVP); buf_info.queueFamilyIndexCount = 0; buf_info.pQueueFamilyIndices = NULL; buf_info.sharingMode = VK_SHARING_MODE_EXCLUSIVE; buf_info.flags = 0; res = vkCreateBuffer(info.device, &buf_info, NULL, &info.uniform_data.buf); assert(res == VK_SUCCESS); VkMemoryRequirements mem_reqs; vkGetBufferMemoryRequirements(info.device, info.uniform_data.buf, &mem_reqs); VkMemoryAllocateInfo alloc_info = {}; alloc_info.sType = VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO; alloc_info.pNext = NULL; alloc_info.memoryTypeIndex = 0; alloc_info.allocationSize = mem_reqs.size; pass = memory_type_from_properties(info, mem_reqs.memoryTypeBits, VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT | VK_MEMORY_PROPERTY_HOST_COHERENT_BIT, &alloc_info.memoryTypeIndex); assert(pass && "No mappable, coherent memory"); res = vkAllocateMemory(info.device, &alloc_info, NULL, &(info.uniform_data.mem)); assert(res == VK_SUCCESS); uint8_t *pData; res = vkMapMemory(info.device, info.uniform_data.mem, 0, mem_reqs.size, 0, (void **)&pData); assert(res == VK_SUCCESS); memcpy(pData, &info.MVP, sizeof(info.MVP)); vkUnmapMemory(info.device, info.uniform_data.mem); res = vkBindBufferMemory(info.device, info.uniform_data.buf, info.uniform_data.mem, 0); assert(res == VK_SUCCESS); info.uniform_data.buffer_info.buffer = info.uniform_data.buf; info.uniform_data.buffer_info.offset = 0; info.uniform_data.buffer_info.range = sizeof(info.MVP); /* VULKAN_KEY_END */ vkDestroyBuffer(info.device, info.uniform_data.buf, 
NULL); vkFreeMemory(info.device, info.uniform_data.mem, NULL); destroy_device(info); destroy_instance(info); return 0; }
<gh_stars>0 /* Copyright (c) 2015, Plume Design Inc. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither the name of the Plume Design Inc. nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL Plume Design Inc. BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ #include <stdio.h> #include <string.h> #include "util.h" #include "log.h" #include "cmd_server.h" /** * "help" command */ CLIENT_HELP(help) = { "[CMD] ; Show help for command CMD", NULL }; CLIENT_CMD(help) { (void)argc; (void)argv; struct client_cmd *cmd; if (argc < 2) { CLI_PRINTF(cli, "\n"); cmd = client_cmd_table; while (cmd->acc_name != NULL) { CLI_PRINTF(cli, "%-16s%s\n", cmd->acc_name, cmd->acc_help->ch_short); cmd++; } CLI_PRINTF(cli, "\n"); } else { cmd = client_cmd_table; while (cmd->acc_name != NULL) { if (strcmp(argv[1], cmd->acc_name) == 0) { if (cmd->acc_help->ch_long == NULL) { CLI_PRINTF(cli, "\nHelp for '%s' not available.\n\n", cmd->acc_name); } else { CLI_PRINTF(cli, "%s - %s\n\n%s\n", cmd->acc_name, cmd->acc_help->ch_short, cmd->acc_help->ch_long); } return true; } cmd++; } CLI_PRINTF(cli, "Help for '%s' not found.\n", argv[1]); } return true; } /** * Show current version */ extern const char *app_build_ver_get(); extern const char *app_build_time_get(); extern const char *app_build_author_get(); CLIENT_HELP(version) = { "show current version", NULL }; CLIENT_CMD(version) { (void)argc; (void)argv; CLI_PRINTF(cli, "%s %s %s\n", app_build_ver_get(), app_build_author_get(), app_build_time_get()); return true; } CLIENT_HELP(base64) = { "Execute a command encoded in base64 (advanced users)", "Receive a base64 encoded string list (separated by the null-character) and execute it as a client command." }; #define BASE64_ARGV_MAX 64 CLIENT_CMD(base64) { char buf[8192]; char *pbuf; ssize_t bufsz; int cmd_argc = 0; char *cmd_argv[BASE64_ARGV_MAX]; if (argc != 2) { CLI_PRINTF(cli, "base64: Invalid number of arguments.\n"); return false; } /* Base64 decode */ bufsz = base64_decode(buf, sizeof(buf), argv[1]); if (bufsz < 0) { CLI_PRINTF(cli, "base64: Error decoding buffer. 
Too big?\n"); return false; } /* Parse the string list inside the decoded buffer */ cmd_argc = 0; pbuf = buf; while (pbuf < buf + bufsz) { if (pbuf + strlen(pbuf) > buf + bufsz) { CLI_PRINTF(cli, "base64: Format error.\n"); return false; } if (cmd_argc >= BASE64_ARGV_MAX) { CLI_PRINTF(cli, "base64: Too many arguments in buffer.\n"); return false; } cmd_argv[cmd_argc++] = pbuf; /* Move to the next string */ pbuf += strlen(pbuf) + 1; } if (cmd_argc <= 0) { CLI_PRINTF(cli, "base64: No command received.\n"); return false; } return client_exec_argv(cli, cmd_argc, cmd_argv); } /** * "loglevel" command */ CLIENT_HELP(loglevel) = { "[LEVEL] ; Show or Set the current log level", "When invoked without arguments it shows the current logging levels.\n" "\n" "With arguments, it sets the current logging levels. The LEVEL parameter is in the \n" "\"[MODULE:]SEVERITY,...\" format. If the module is omitted, the default log level is\n" "set.\n", }; CLIENT_CMD(loglevel) { int ii; if (false == log_isenabled()) { CLI_PRINTF(cli, "Logger not enabled yet.\n"); return true; } if (argc < 2) { CLI_PRINTF(cli, "\nDefault log level is: %s\n", log_severity_str(log_severity_get())); CLI_PRINTF(cli, "Log levels:"); for (ii = 0; ii < LOG_SEVERITY_LAST; ii++) { CLI_PRINTF(cli, " %s", log_severity_str(ii)); } CLI_PRINTF(cli, "\n"); CLI_PRINTF(cli, "Module log levels:\n"); for (ii = 0; ii <LOG_MODULE_ID_LAST; ii++) { CLI_PRINTF(cli, "\t%8s - %s\n", log_module_str(ii), log_severity_str(log_module_severity_get(ii))); } CLI_PRINTF(cli, "\n"); } else { if (!log_severity_parse(argv[1])) { CLI_PRINTF(cli, "Error: Invalid loglevel string '%s'.\n", argv[1]); } } return true; }
/* * Licensed to the Apache Software Foundation (ASF) under one or more * contributor license agreements. See the NOTICE file distributed with * this work for additional information regarding copyright ownership. * The ASF licenses this file to You under the Apache License, Version 2.0 * (the "License"); you may not use this file except in compliance with * the License. You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ import EventEmitter from 'events' import { ITrace } from '../types' import { checkHessianParam, checkRetValHessian, isBoolean, isFn, isMap, isNumber, isString, msg, Version } from '../util' describe('test util method', () => { it('msg pub/sub', () => { expect(msg).toBeDefined() expect(msg instanceof EventEmitter).toBeTruthy() msg.on('hello', (data) => { expect(data).toEqual({ name: '@apache/dubbo-js' }) }) msg.emit('hello', { name: '@apache/dubbo-js' }) }) it('isBoolean', () => { expect(isBoolean(false)).toBeTruthy() expect(isBoolean(true)).toBeTruthy() expect(isBoolean(1)).toBeFalsy() expect(isBoolean('')).toBeFalsy() expect(isBoolean(undefined)).toBeFalsy() }) it('isString', () => { expect(isString('')).toBeTruthy() expect(isString(true)).toBeFalsy() expect(isString(1)).toBeFalsy() expect(isString(undefined)).toBeFalsy() }) it('isNumber', () => { expect(isNumber(1)).toBeTruthy() expect(isNumber(1.0)).toBeTruthy() expect(isNumber(true)).toBeFalsy() expect(isNumber('')).toBeFalsy() expect(isNumber(undefined)).toBeFalsy() }) it('isMap', () => { const map = new Map() expect(isMap(map)).toBeTruthy() expect(isMap(1.0)).toBeFalsy() expect(isMap(true)).toBeFalsy() expect(isMap('')).toBeFalsy() expect(isMap(undefined)).toBeFalsy() }) it('isFn', () => { expect(isFn(() => {})).toBeTruthy() expect(isFn('')).toBeFalsy() expect(isFn(1.0)).toBeFalsy() expect(isFn(true)).toBeFalsy() expect(isFn('')).toBeFalsy() expect(isFn(undefined)).toBeFalsy() }) it('checkHessianParam', () => { expect(checkHessianParam({ $class: 'java.lang.Long', $: 123 })).toBeTruthy() expect(checkHessianParam('')).toBeFalsy() expect(checkHessianParam(1)).toBeFalsy() expect(checkHessianParam(true)).toBeFalsy() expect(checkHessianParam({})).toBeFalsy() }) it('checkRetValHessian', () => { expect(checkRetValHessian(undefined)).toBeTruthy() expect(checkRetValHessian(true)).toBeTruthy() expect(checkRetValHessian(0)).toBeTruthy() expect(checkRetValHessian('')).toBeTruthy() expect(checkRetValHessian(new Map())).toBeTruthy() expect(checkRetValHessian({})).toBeFalsy() expect( checkRetValHessian({ __fields2java() { return { $class: 'java.lang.String', $: 'hello' } } }) ).toBeTruthy() }) it('trace', () => { msg.on('sys:trace', (data: ITrace) => { expect(data.type).toEqual('INFO') expect(data.msg).toEqual('@apache/dubbo-js') }) }) it('version', () => { expect(Version.isSupportResponseAttachment('2.0.2')).toBeTruthy() expect(Version.isSupportResponseAttachment('2.0.99')).toBeTruthy() expect(Version.isSupportResponseAttachment('2.0.0')).toBeFalsy() expect(Version.isSupportResponseAttachment('2.0.199')).toBeFalsy() expect(Version.getIntVersion('2.0.2')).toEqual(2000200) }) })
On theultimate sensitivity in coordinatemeasurements A high-sensitivitymeterof small mechanicaldisplacementsis a key elementin a numberof ambitiousexperimental programsincludingdifferenttypes of probemassexperiments,development of gravitywave detectors and mechanical QND measurements. For example,in gravity-waveantennas,experimentalistsare confrontedwith the necessityto measuresmalldisplacements AL ~hLof freemasses separatedby thedistanceL dueto a gravity-waveinducedperturbationof the metric h. For the lowest necessary level of sensitivity h = 10~2I, and L=4X 10~cm,the valueofthe displacement should be AL~2x1016 cm. In accordancewith existing astrophysical predictionsfor the intensityof gravitywave signalsfrom extra-terrestrialsources,this displacement should be registered over the time r~10~_i 0~s, andthisrequiresthatthesensitivity of the coordinate meter should be better than LS.Lf~ 6 x1018 cm/.,JI~i A highersensitivityisnecessaryfor experimentsaimedat reachingthe standardquantumlimit in the measurement of the coordinateof the probemassm:
/** * wlan_hdd_cfg80211_set_wiphy_sae_feature() - Indicates support of SAE feature * @wiphy: Pointer to wiphy * @config: pointer to config * * This function is used to indicate the support of SAE * * Return: None */ static void wlan_hdd_cfg80211_set_wiphy_sae_feature(struct wiphy *wiphy, hdd_config_t *config) { if (config->is_sae_enabled) wiphy->features |= NL80211_FEATURE_SAE; }
/** * Integration test for RM-2192 * * @author Tuna Aksoy * @since 2.2.1.1 */ public class RM2192Test extends BaseRMTestCase { private static final String PATH = "/111/222/333"; private RuleService ruleService; private JSONConversionComponent converter; private NodeRef folder; private String user; private NodeRef documentLibrary2; @Override protected void initServices() { super.initServices(); ruleService = (RuleService) applicationContext.getBean("RuleService"); converter = (JSONConversionComponent) applicationContext.getBean("jsonConversionComponent"); } @Override protected boolean isCollaborationSiteTest() { return true; } @Override protected boolean isRecordTest() { return true; } @Override protected boolean isUserTest() { return true; } @Override protected void setupCollaborationSiteTestDataImpl() { super.setupCollaborationSiteTestDataImpl(); String collabSiteId2 = generate(); siteService.createSite("site-dashboard", collabSiteId2, generate(), generate(), PUBLIC); documentLibrary2 = getSiteContainer( collabSiteId2, DOCUMENT_LIBRARY, true, siteService, transactionService, taggingService); assertNotNull("Collaboration site document library component was not successfully created.", documentLibrary2); user = generate(); createPerson(user); siteService.setMembership(collabSiteId2, user, SITE_MANAGER); filePlanRoleService.assignRoleToAuthority(filePlan, ROLE_RECORDS_MANAGER, user); } public void testAccessToRecordAfterDeclaring() { doTestInTransaction(new Test<Void>() { @Override public Void run() { folder = fileFolderService.create(documentLibrary2, generate(), TYPE_FOLDER).getNodeRef(); Action createAction = actionService.createAction(CreateRecordAction.NAME); createAction.setParameterValue(CreateRecordAction.PARAM_FILE_PLAN, filePlan); Rule declareRule = new Rule(); declareRule.setRuleType(INBOUND); declareRule.setTitle(generate()); declareRule.setAction(createAction); declareRule.setExecuteAsynchronously(true); declareRule.applyToChildren(true); ruleService.saveRule(folder, declareRule); Action fileAction = actionService.createAction(FileToAction.NAME); fileAction.setParameterValue(FileToAction.PARAM_PATH, PATH); fileAction.setParameterValue(FileToAction.PARAM_CREATE_RECORD_PATH, true); Rule fileRule = new Rule(); fileRule.setRuleType(INBOUND); fileRule.setTitle(generate()); fileRule.setAction(fileAction); fileRule.setExecuteAsynchronously(true); ruleService.saveRule(unfiledContainer, fileRule); return null; } @Override public void test(Void result) throws Exception { assertFalse(ruleService.getRules(folder).isEmpty()); assertFalse(ruleService.getRules(unfiledContainer).isEmpty()); } }); doTestInTransaction(new Test<Void>() { NodeRef document; @Override public Void run() { document = fileFolderService.create(folder, generate(), TYPE_CONTENT).getNodeRef(); return null; } @Override public void test(Void result) throws InterruptedException { Thread.sleep(10000); assertEquals(permissionService.hasPermission(document, READ_RECORDS), ALLOWED); assertTrue(recordService.isFiled(document)); assertNotNull(converter.toJSON(document, true)); } }, user); } }
Structural basis for nonribosomal peptide synthesis by an aminoacyl-tRNA synthetase paralog Cyclodipeptides are secondary metabolites biosynthesized by many bacteria and exhibit a wide array of biological activities. Recently, a new class of small proteins, named cyclodipeptide synthases (CDPS), which are unrelated to the typical nonribosomal peptide synthetases, was shown to generate several cyclodipeptides, using aminoacyl-tRNAs as substrates. The Mycobacterium tuberculosis CDPS, Rv2275, was found to generate cyclodityrosine through the formation of an aminoacyl-enzyme intermediate and to have a structure and oligomeric state similar to those of the class Ic aminoacyl-tRNA synthetases (aaRSs). However, the poor sequence conservation among CDPSs has raised questions about the architecture and catalytic mechanism of the identified homologs. Here we report the crystal structures of Bacillus licheniformis CDPS YvmC-Blic, in the apo form and complexed with substrate mimics, at 1.7–2.4-Å resolutions. The YvmC-Blic structure also exhibits similarity to the class Ic aaRSs catalytic domain. Our mutational analysis confirmed the importance of a set of residues for cyclodileucine formation among the conserved residues localized in the catalytic pocket. Our biochemical data indicated that YvmC-Blic binds tRNA and generates cyclodileucine as a monomer. We were also able to detect the presence of an aminoacyl-enzyme reaction intermediate, but not a dipeptide tRNA intermediate, whose existence was postulated for Rv2275. Instead, our results support a sequential catalytic mechanism for YvmC-Blic, with the successive attachment of two leucine residues on the enzyme via a conserved serine residue. Altogether, our findings suggest that all CDPS enzymes share a common aaRS-like architecture and a catalytic mechanism involving the formation of an enzyme-bound intermediate.
#include<bits/stdc++.h> using namespace std; #define ll long long int int main() { int t; cin>>t; while(t--) { ll x,y; cin>>x>>y; ll p,q; cin>>p>>q; ll lo=1; ll hi=1000000000; ll ans=-1; while(lo<=hi) { ll mi=(lo+hi)/2; ll r=p*mi; ll s=q*mi; ll a=r-x; ll b=s-y; if(a>=0 && b>=0 && b>=a) { ans=mi; hi=mi-1; } else lo=mi+1; } if(ans==-1) cout<<"-1"<<endl; else { ans=q*ans-y; cout<<ans<<endl; } } return 0; }
They are the poster boys of matrimonial classifieds. They are paid handsomely, perceived to be intelligent and travel abroad frequently. Single-handedly, they brought purpose to the otherwise sleepy city of Bangalore. Indian software engineers are today the face of a third-world rebellion. But what exactly do they do? That’s a disturbing question. Last week, during the annual fair of the software industry’s apex body Nasscom, no one uttered a word about India’s programmers. The event, which brought together software professionals from around the world, used up all its 29 sessions to discuss prospects to improve the performance of software companies. Panels chose to debate extensively on subjects like managing innovation, business growth and multiple geographies. But there was nothing on programmers, who you would imagine are the driving force behind the success of the Indian software companies. Perhaps you imagined wrong. “It is an explosive truth that local software companies won’t accept. Most software professionals in India are not programmers, they are mere coders,” says a senior executive from a global consultancy firm, who has helped Nasscom in researching its industry reports. In industry parlance, coders are akin to smart assembly line workers as opposed to programmers who are plant engineers. Programmers are the brains, the glorious visionaries who create things. Large software programmes that often run into billions of lines are designed and developed by a handful of programmers. Coders follow instructions to write, evaluate and test small components of the large program. As a computer science student in IIT Mumbai puts it if programming requires a post graduate level of knowledge of complex algorithms and programming methods, coding requires only high school knowledge of the subject. Coding is also the grime job. It is repetitive and monotonous. Coders know that. They feel stuck in their jobs. They have fallen into the trap of the software hype and now realise that though their status is glorified in the society, intellectually they are stranded. Companies do not offer them stock options anymore and their salaries are not growing at the spectacular rates at which they did a few years ago. “There is nothing new to learn from the job I am doing in Pune. I could have done it with some training even after passing high school,” says a 25-year-old who joined Infosys after finishing his engineering course in Nagpur. A Microsoft analyst says, “Like our manufacturing industry, the Indian software industry is largely a process driven one. That should speak for the fact that we still don’t have a domestic software product like Yahoo or Google to use in our daily lives.” IIT graduates have consciously shunned India’s best known companies like Infosys and TCS, though they offered very attractive salaries. Last year, from IIT Powai, the top three Indian IT companies got just 10 students out of the 574 who passed out. The best computer science students prefer to join companies like Google and Trilogy. Krishna Prasad from the College of Engineering, Guindy, Chennai, who did not bite Infosys’ offer, says, “The entrance test to join TCS is a joke compared to the one in Trilogy. That speaks of what the Indian firms are looking for.” A senior TCS executive, who requested anonymity, admitted that the perception of coders is changing even within the company. It is a gloomy outlook. He believes it has a lot to do with business dynamics. 
The executive, a programmer for two decades, says that in the late ’70s and early ’80s, software drew a motley set of professionals from all kinds of fields. In the mid-’90s, as onsite projects increased dramatically, software companies started picking all the engineers they could as the US authorities granted visas only to graduates who had four years of education after high school. “After Y2K, as American companies discovered India’s cheap software professionals, the demand for engineers shot up,” the executive says. Most of these engineers were coders. They were almost identical workers who sat long hours to write line after line of codes, or test a fraction of a programme. They did not complain because their pay and perks were good. Now, the demand for coding has diminished, and there is a churning. Over the years, due to the improved communication networks and increased reliability of Indian firms, projects that required a worker to be at a client’s site, say in America, are dwindling in number. And with it the need for engineers who have four years of education after high school. Graduates from non-professional courses, companies know, can do the engineer’s job equally well. Also, over the years, as Indian companies have already coded for many common applications like banking, insurance and accounting, they have created libraries of code which they reuse. Top software companies have now started recruiting science graduates who will be trained alongside engineers and deployed in the same projects. The CEO of India’s largest software company TCS, S Ramadorai, had earlier explained, “The core programming still requires technical skills. But, there are other jobs we found that can be done by graduates.” NIIT’s Arvind Thakur says, “We have always maintained that it is the aptitude and not qualifications that is vital for programming. In fact, there are cases where graduate programmers have done better than the ones from the engineering stream.” Software engineers, are increasingly getting dejected. Sachin Rao, one of the coders stuck in the routine of a job that does not excite him anymore, has been toying with the idea of moving out of Infosys but cannot find a different kind of “break”, given his coding experience. He sums up his plight by vaguely recollecting a story in which thousands of caterpillars keep climbing a wall, the height of which they don’t know. They clamber over each other, fall, start again, but keep climbing. They don’t know that they can eventually fly. Rao cannot remember how the story ends but feels the coders of India today are like the caterpillars who plod their way through while there are more spectacular ways of reaching the various destinations of life.. Advertisements Like this: Like Loading... Related
#include <stdio.h> int main () { long a[1000],max=0,t[10000],n,j,k=0,test,i; scanf("%ld",&n); for(i=0;i<n;i++) {scanf("%ld",&a[i]); j=0; test=0; if(a[i]>=0) {do{ if((j*j)==a[i]) test=1; j++; }while(((j*j)<=a[i])&&(test==0)); if (test==0) {t[k]=a[i]; k++;}} else {t[k]=a[i]; k++;} }max=t[0]; for(i=1;i<k;i++) if(max<t[i]) {max=t[i];} printf("%ld",max); return 0; }
def _dirs_configure_experiment_dir(self, root, results, experiment): self._experiment_dir="%s/%s/%s" %( root,results,experiment) self._topology_dir = "%s/topology" %self._experiment_dir if not os.path.exists(self._experiment_dir): try: os.makedirs(self._experiment_dir) except os.error: print "Dir %s exists or cannot be created" % self._experiment_dir os.chown("%s/%s"%(root,results), self._uid, self._gid) os.chown(self._experiment_dir, self._uid, self._gid) if not os.path.exists(self._topology_dir): try: os.makedirs(self._topology_dir) except os.error: print "Dir %s exists or cannot be created" % self._topology_dir os.chown(self._topology_dir, self._uid, self._gid)
/** * This problem was asked by Uber. A rule looks like this: A NE B This means this means point A is located northeast of point B. A SW C means that point A is southwest of C. Given a list of rules, check if the sum of the rules validate. For example: A N B B NE C C N A does not validate, since A cannot be both north and south of C. A NW B A N B is considered valid. * @author Andrew * */ public class ValidateCompassRules { public static boolean rulesAreValid(List<String> rules) { return false; } @Test @Ignore public void defaultTest() { List<String> list = Arrays.asList("A N B", "B NE C"," C N A"); assertFalse(rulesAreValid(list)); list = Arrays.asList("A NW B","A N B"); assertTrue(rulesAreValid(list)); } }
def build_grid(key): if not key.isalpha(): raise ValueError key = key.upper().replace("J", "I") alphabet = "ABCDEFGHIKLMNOPQRSTUVWXYZ" key_stream = "".join([key, alphabet]) dim = 5 key_grid = [[""]*dim for _ in range(dim)] grid_i = grid_j = 0 used = set() for k in key_stream: if k not in used: used.add(k) key_grid[grid_i][grid_j] = k grid_j = (grid_j + 1) % dim if grid_j is 0: grid_i += 1 if grid_i is dim: break return key_grid
from rest_framework.decorators import action from rest_framework.schemas import AutoSchema from rest_framework.response import Response from rest_framework import viewsets, status from coreapi import Field, Link, document from django.db.models import Sum import coreschema from apps.summaries.serializers import SummariesSerializer from apps.categorys.serializers import CategorySerializer from apps.bills.serializers import BillSerializer from apps.bills.models import Bills from apps.categorys.models import Categorys from string import Template from datetime import datetime # 折线数据处理 def linechart_handle(time_type, time_stat): time_list = [item['time'] for item in time_stat] if not time_list: return [] max_time = max(time_list) time_format = '%Y-%m' if time_type == 'YEAR' else '%Y-%m-%d' dt = datetime.strptime(max_time, time_format) upper_limit = dt.month if time_type == 'YEAR' else dt.day def init_item(item): day = 1 if time_type == 'YEAR' else item month = item if time_type == 'YEAR' else dt.month year = dt.year return { "time": datetime(day=day, month=month, year=year).strftime(time_format), "total_amount": 0 } init_data = map(init_item, range(1, upper_limit + 1)) init_data = list(init_data) # merge init_data & time_stat for init_item in list(init_data): for stat_item in time_stat: if init_item['time'] == stat_item['time']: init_item['total_amount'] = stat_item['total_amount'] return init_data # 环状数据处理 def ringchart_handle(stat_data): # 根据data获取到类别ID列表 id_list = [item['category'] for item in stat_data] # 使用 user 查出对应的类别映射对象 category_list = Categorys.objects.filter(id__in=id_list).values('id', 'name') category_list = list(category_list) # 计算类别金额总和 stat_sum = sum(item['amount_sum'] for item in stat_data) for stat_item in stat_data: # 替换类别名称 for category_item in category_list: if stat_item['category'] == category_item['id']: stat_item['category'] = category_item['name'] rate = stat_item['amount_sum'] / stat_sum # 百分比:四舍五入小数点两位 stat_item['rate'] = float("{0:.2f}".format(round(rate,2))) return stat_data class CustomAutoSchema(AutoSchema): def get_link(self, path, method, base_url): link = super().get_link(path, method, base_url) fields = [ Field( 'time_type', location='query', required=True, schema=coreschema.Enum(enum=['year', 'month'])), Field( 'time_value', location='query', required=True, schema=coreschema.String()), ] fields = tuple(fields) link = Link( url=link.url, action=link.action, encoding=link.encoding, fields=fields, description=link.description) document.Link() return link class SummariesViewSet(viewsets.GenericViewSet): schema = CustomAutoSchema() def get_queryset(self): serializer = SummariesSerializer(data=self.request.query_params) serializer.is_valid(raise_exception=True) time_type = self.request.query_params.get('time_type') time_value = self.request.query_params.get('time_value') if time_type == 'YEAR': return Bills.objects.filter( record_date__year=time_value, user=self.request.user) if time_type == 'MONTH': time_value = time_value.split('-', 1) return Bills.objects.filter( record_date__year=time_value[0], record_date__month=time_value[1], user=self.request.user) @action(methods=['GET'], detail=False) def info(self, request): """ 概要信息 --- ## request: |参数名|描述|请求类型|参数类型|备注| |:----:|:----:|:----:|:----:|:----:| | time_type | 时间类型 | query | string | `YEAR` or `MONTH` | | time_value | 时间值 | query | string | `YYYY` or `YYYY-MM`| ## response: ``` json { "time": "统计时间", "income_amount": "收入数额", "outgo_amount": "支出数额", "balance_amount": "结余数额" } """ query = self.get_queryset() 
result = query.values('bill_type').annotate( amount_sum=Sum('amount')).order_by() try: income_amount = result.get(bill_type=1)['amount_sum'] except Bills.DoesNotExist: income_amount = 0 try: outgo_amount = result.get(bill_type=0)['amount_sum'] except Bills.DoesNotExist: outgo_amount = 0 res_dict = { "time": request.query_params.get('time_value'), "income_amount": income_amount, "outgo_amount": outgo_amount, "balance_amount": income_amount - outgo_amount } return Response(data=res_dict, status=status.HTTP_200_OK) @action(methods=['GET'], detail=False) def linechart(self, request): """ 折线图 --- ## request: |参数名|描述|请求类型|参数类型|备注| |:----:|:----:|:----:|:----:|:----:| | time_type | 时间类型 | query | string | `YEAR` or `MONTH` | | time_value | 时间值 | query | string | `YYYY` or `YYYY-MM`| ## response: ``` json { "income": [ { "time": "YYYY-MM or YYYY-MM-DD", "total_amount": "金额" } ], "outgo": [ { "time": "YYYY-MM or YYYY-MM-DD", "total_amount": "金额" } ] } """ time_type = self.request.query_params.get('time_type') queryset = self.get_queryset() if time_type == 'YEAR': stat_income = queryset.filter(bill_type=1).extra({ 'time': "DATE_FORMAT(record_date,'%%Y-%%m')" }).values('time').annotate(total_amount=Sum('amount')).order_by() stat_outgo = queryset.filter(bill_type=0).extra({ 'time': "DATE_FORMAT(record_date,'%%Y-%%m')" }).values('time').annotate(total_amount=Sum('amount')).order_by() if time_type == 'MONTH': stat_income = queryset.filter(bill_type=1).extra({ 'time': "DATE_FORMAT(record_date,'%%Y-%%m-%%d')" }).values('time').annotate(total_amount=Sum('amount')).order_by() stat_outgo= queryset.filter(bill_type=0).extra({ 'time': "DATE_FORMAT(record_date,'%%Y-%%m-%%d')" }).values('time').annotate(total_amount=Sum('amount')).order_by() stat_income = list(stat_income) stat_outgo = list(stat_outgo) income_data = linechart_handle(time_type, stat_income) outgo_data = linechart_handle(time_type, stat_outgo) result = { "income": income_data, "outgo": outgo_data } return Response(status=status.HTTP_200_OK, data=result) @action(methods=['GET'], detail=False) def ringchart(self, request): """ 环状图 --- ## request: |参数名|描述|请求类型|参数类型|备注| |:----:|:----:|:----:|:----:|:----:| | time_type | 时间类型 | query | string | `YEAR` or `MONTH` | | time_value | 时间值 | query | string | `YYYY` or `YYYY-MM`| ## response: ``` json { "income": [ { "category": "类别", "amount": "金额" } ], "outgo": [ { "category": "类别", "amount": "金额" } ] } """ queryset = self.get_queryset() income_stat = queryset.filter(bill_type=1).values('category').annotate( amount_sum=Sum('amount')).order_by() outgo_stat = queryset.filter(bill_type=0).values('category').annotate( amount_sum=Sum('amount')).order_by() income_stat = list(income_stat) outgo_stat = list(outgo_stat) if income_stat: ringchart_handle(income_stat) if outgo_stat: ringchart_handle(outgo_stat) result = { 'income': income_stat, 'outgo': outgo_stat } return Response(status=status.HTTP_200_OK, data=result)
def __setEmptyLom(self): self.lom = { "title": "", "catalogentry": [], "language": [], "description": [], "keyword": [], "coverage": [], "structure": "", "aggregationlevel": "", "version": "", "status": "", "contribute": [], "metacatalogentry": [], "metacontribute": [], "metadatascheme": [], "metalanguage": "", "format": [], "size": "", "location": "", "duration": "", "educational": [], "cost": "", "copyrightandotherrestrictions": "", "copyrightdescription": "", "relation": [], "classification": [] }
/** * Handles an incoming packet.<br> * This will check if the UUID submitted is a UUID of a submitted ticket and * call the callback of that ticket. If this is not the case or is null, the * packet will be sent to the Packet Handler. * * @param uuid The UUID of an incoming response packet. * @param data The data of the incoming packet. */ public void handleIncoming(int id, UUID uuid, JSONObject data) { if (uuid != null && id == 5) { PacketTicket ticket = getTicket(uuid); if (ticket != null && ticket.getCallback() != null) { ticket.getCallback().setData(data); ticket.getCallback().callback(); this.tickets.remove(ticket); return; } } PacketHandler.handlePacket(id, data); }
A not terribly original (I'm guessing) concept sketch for a reborn flying city of Gnomeregan for thegame. Done from a blind start with no planning, so yes, it's probably terrible. With a bit of tweaking, it could work, though (with plenty of opportunities for sci-fi references).The outside is all metallic but from the inside certain parts (e.g., the dome) would look like clear or tinted glass. Possibly bigger on the inside than it is on the outside. The ears swivel and provide propulsion/lift. Mouth opens/closes. Could launch aircraft from there, but I don't know about landing them. Iron Gnomes. Helipad up top. A Zed gnome (reference). Maybe a-style mechagnome somewhere. A room with a portal to send you wherever you want to go (at least 85% of the time, anyway). Other stuff. Most of the profession trainers and a food court in the dome. The engineering section could be a mini science expo. Wish I could do a better write up, but my brain is currently like mud.The city would basically just fly around randomly, using a giant dimensional ripper to "blink" from one place to another. It could show up just about anywhere.The opposing side could be either just the back of a the male head of a female gnome face. I started a drawing based on that idea , but I wasn't thrilled with how it was looking so I stopped.
#!/usr/bin/env python3 import feedparser import youtube_dl import sys from pathlib import Path import argparse from time import time, mktime from datetime import datetime from dateutil.relativedelta import relativedelta from dateutil.parser import parse as dateparse from appdirs import AppDirs from icecream import ic import json ic.disable() if sys.version_info.major < 3 and sys.version_info.minor < 6: raise Exception("Must be using Python 3.6 or greater") xdgDirs = AppDirs("yt-dl-subs") if __name__ == "__main__": parser = argparse.ArgumentParser("Download YouTube subscriptions.") parser.add_argument( "--output-path", "-o", default=Path().home() / "Videos" / "YT Subs", help=f"The directory to which to save the videos. Default {Path().home() / 'Videos' / 'YT Subs'}", type=Path, ) parser.add_argument( "--retain", "-r", default=None, help="Retain videos up to the given number of days since today. Default: None", type=int, ) parser.add_argument( "--since", "-s", default=None, help="Only download videos newer than the given number of days. Default: None", type=int, ) parser.add_argument( "--config-path", "-c", default=Path(xdgDirs.user_config_dir), help=f"The directory to which config is saved. Default: {xdgDirs.user_config_dir}", type=Path, ) parser.add_argument( "--create-directories", default=False, help="Create all directories if they do not exist. Default: False", action="store_true", ) parser.add_argument( "--debug", "-d", default=False, help="Print some more verbose debugging info.", action="store_true", ) parser.add_argument( "--no-download", default=False, help="Set this to not download any videos.", action="store_true", ) parser.add_argument( "--quiet", "-q", default=False, help="Reduce the output.", action="store_true" ) args = parser.parse_args() if args.debug: ic.enable() ic(args) if args.retain and args.retain < args.since: print( "It is not a good idea to remove newer files than what you want to download." ) if input("Continue y/[n]: ").lower() != "y": quit() # The current run time. scriptStartTime = time() if isinstance(args.config_path, str): confDir = Path(args.config_path) else: confDir = args.config_path if isinstance(args.output_path, str): outputPath = Path(args.output_path) else: outputPath = args.output_path stateDir = Path(xdgDirs.user_state_dir) if args.create_directories: print("Creating directories.") if not confDir.exists(): confDir.mkdir(parents=True) if not outputPath.exists(): outputPath.mkdir(parents=True) if not stateDir.exists(): stateDir.mkdir(parents=True) exit() lastFile = stateDir / "last.time" subsJSONFile = confDir / "subscriptions.json" # Overrule the time from which to download video if we've been asked to # keep videos since a certain number of days ago. 
if args.since is not None: sinceTimestamp = datetime.now() - relativedelta(days=int(args.since)) ic("args.since is set", sinceTimestamp) elif not lastFile.exists(): lastFile.write_text(str(time())) sinceTimestamp = datetime.now() - relativedelta(days=7) ic("lastFile does not exist", str(time()), sinceTimestamp) else: sinceTimestamp = dateparse(lastFile.read_text()) ic("lastFile exists and is read", sinceTimestamp) tmpJSON = json.loads(subsJSONFile.read_text()) ic(len(tmpJSON)) ic(tmpJSON[0]) baseURL = "https://www.youtube.com/feeds/videos.xml?channel_id=" feedURLs = [ baseURL + item["snippet"]["resourceId"]["channelId"] for item in tmpJSON ] ic(feedURLs[:10]) # Nothing is purged by default if args.retain is not None: # Create a timestamp that points to a time a couple of days ago # using the retain option. retainTimestamp = datetime.now() - relativedelta(days=int(args.retain)) # Loop over all the files in the output directory and # remove all that are older that the retainTimestamp for video in outputPath.glob("**/*.*"): modifiedTime = datetime.fromtimestamp(video.stat().st_mtime) ic(modifiedTime) ic(retainTimestamp) if modifiedTime < retainTimestamp: if not args.quiet: print(f"Removing {str(video)}.") video.unlink() input("Removal is done.") # Loop over the feed URLs and get the latest uploads videoURLs = [] for i, url in enumerate(feedURLs): if not args.quiet: print(f"Parsing through channel {i + 1} of {len(feedURLs)}") ic("Feed URL:", url) feed = feedparser.parse(url) for item in feed["items"]: publishedDate = datetime.fromtimestamp(mktime(item["published_parsed"])) ic(publishedDate) if publishedDate > sinceTimestamp: ic(item["author"]) videoURLs.append(item["link"]) if len(videoURLs) == 0: print("No new video found") quit() elif not args.quiet: print(f"{len(videoURLs)} new videos found") ydl_opts = { "ignoreerrors": True, "quiet": args.quiet, "outtmpl": (outputPath / "%(uploader)s - %(title)s.%(ext)s").as_posix(), "format": "best", "verbose": args.debug, } ic(ydl_opts) if not args.debug and not args.quiet: print("Downloading the videos now...") if not args.no_download: with youtube_dl.YoutubeDL(ydl_opts) as ydl: ydl.download(videoURLs) elif not args.quiet: ic(videoURLs) print("This is a dry run, nothing is downloaded.") lastFile.write_text(str(datetime.now())) if not args.quiet: print("Finished downloading videos.")
<gh_stars>10-100 import Database.HaskellDB import TestConnect import Dp037.D3proj_users -- victor said that top, topPercent and union produce SQL errors opts = ODBCOptions{dsn="mysql-dp037", uid="dp037", pwd="<PASSWORD>"} withDB f = odbcConnect opts f q1 = do users <- table d3proj_users top 2 order [asc users xid] return users {- q2 = do users <- table d3proj_users topPercent 20 return users -} q3 = do users <- table d3proj_users order [desc users xid] top 2 return users pp = putStrLn . show . showSql printIds = mapM (\r -> putStrLn (r!xid)) tests db = do putStrLn "top:" pp q1 rs <- query db q1 printIds rs putStrLn "" -- putStrLn "topPercent:" -- pp q2 -- putStrLn "" putStrLn "union:" let u = q1 `union` q3 pp u rs <- query db u printIds rs putStrLn "" main = argConnect tests
// ScanFile is a convenience method that abstracts the details of the multi-step file upload and scan process. // Calling this method for a given file is equivalent to (1) manually initializing a file upload session, // (2) uploading all chunks of the file, (3) completing the upload, and (4) triggering a scan of the file. // // The maximum allowed ContentSizeBytes is dependent on the terms of your current // Nightfall usage plan agreement; check the Nightfall dashboard for more details. // // This method consumes the provided reader, but it does not close it; closing remains // the caller's responsibility. func (c *Client) ScanFile(ctx context.Context, request *ScanFileRequest) (*ScanFileResponse, error) { var cancel context.CancelFunc if request.Timeout > 0 { ctx, cancel = context.WithTimeout(ctx, request.Timeout) } else { ctx, cancel = context.WithCancel(ctx) } defer cancel() fileUpload, err := c.initFileUpload(ctx, &fileUploadRequest{FileSizeBytes: request.ContentSizeBytes}) if err != nil { return nil, err } err = c.doChunkedUpload(ctx, fileUpload, request.Content) if err != nil { return nil, err } err = c.completeFileUpload(ctx, fileUpload.ID) if err != nil { return nil, err } return c.scanUploadedFile(ctx, request, fileUpload.ID) }
def value(self, toks): stack = [] for tok in toks: if tok[0] == '%': value = int(tok[1:], 2) elif tok[0:2] == '0b': value = int(tok[2:], 2) elif tok[0] == '&': value = int(tok[1:], 16) elif tok[0:2] == '0x': value = int(tok[2:], 16) elif tok.isdigit() or (tok[0] == '-' and tok[1:].isdigit()): value = int(tok) elif tok[0] == '"' and tok[-1] == '"': value = tok[1:-1] elif tok[0] == "'" and tok[-1] == "'": value = tok else: value = tok stack.append(value) if not stack: return None if len(stack) == 1: return stack[0] return stack
China Remains EY’s Most Attractive Renewable Energy Country As US Stalls At #3 October 10th, 2017 by Joshua S Hill China has held on to its spot as EY’s most attractive renewable energy country after taking the top spot earlier this year, leaving room for India to step into second place as the United States stalled at third due to new political headwinds. EY published its latest Renewable energy country attractiveness index (RECAI) report on Tuesday, revealing that China has held on to its position as the world’s most attractive renewable energy market. China and India both overtook the United States in May’s RECAI report, dropping the US out of top spot for the first time since 2015. China maintains its top spot with unsurprising constant focus on renewable energy development and energy efficiency policies, but India’s position at second is now viewed by EY as “increasingly precarious” due to cancelled wind energy Power Purchase Agreements and steep declines in tariffs bid in recent auctions which have placed doubt over India’s 2022 target of 100 GW of solar PV. It is similarly unsurprising that the United States has remained in third spot — or, maybe, more surprising is the fact that the United States hasn’t fallen further. One wonders what EY’s thoughts of the United States would be if it was to include this week’s news that the US Environmental Protection Agency (EPA) has decided to “withdraw” President Obama’s Clean Power Plan. Overall, the United States has seen numerous rollbacks to Obama-era climate change policies — and EY has highlighted the decision made by the country’s International Trade Commission’s decision to rule in favor of Suniva and SolarWorld’s Section 201 trade complaint, which will likely lead to tariffs and price floors on solar PV technology imports. Interestingly, this 50th edition of EY’s RECAI shows Middle Eastern and North African countries climbing the index due to a surge in renewable energy activity hand-in-hand with a series of policy developments in those regions, as well as financing deals and tenders. Specifically, the International Finance Corporation approved $635 million for 500 MW of solar projects in Egypt; Saudi Arabia has invited bids for its first utility-scale wind farm — a 400 MW project; and Algeria has entered the top 40 RECAI for the first time thanks to its 4 GW solar tender. “The index highlights that government policy is pivotal in driving renewable energy development globally,” said Ben Warren, EY Global Power & Utilities Corporate Finance Leader and RECAI Chief Editor. “As it becomes increasingly clear that time is running out for legacy energy supply models, countries are vying for their place in a clean energy future. Collaboration with existing suppliers and innovative partners through partnerships and acquisitions holds the key to success in this new world.” South Korea has also jumped up four places to 29th despite regional instability, thanks to national plans to increase its share of renewables from 6.6% today up to 20% by 2030. Further, France is stepping closer to the top five, jumping up into sixth position with the announcement of new tenders and acquisitions, backed up by strong political support for the renewable energy industry by French President Macron.
package irondb // An Entry object represents a single database record. type Entry struct { Id int `json:"id"` Active bool `json:"active"` Title string `json:"title"` Url string `json:"url"` Username string `json:"username"` Passwords []string `json:"passwords"` Email string `json:"email"` Tags []string `json:"tags"` Notes string `json:"notes"` } // An ExportEntry object represents a database record prepared for export. type ExportEntry struct { Title string `json:"title"` Url string `json:"url"` Username string `json:"username"` Passwords []string `json:"passwords"` Email string `json:"email"` Tags []string `json:"tags"` Notes string `json:"notes"` } // NewEntry initializes a new Entry object. func NewEntry() *Entry { return &Entry{ Active: true, Passwords: make([]string, 0), Tags: make([]string, 0), } } // GetPassword returns the newest password from the passwords list. func (entry *Entry) GetPassword() string { if len(entry.Passwords) > 0 { return entry.Passwords[len(entry.Passwords) - 1] } return "" } // SetPassword appends a new password to the passsword list. func (entry *Entry) SetPassword(password string) { entry.Passwords = append(entry.Passwords, password) }
import { Component } from '@angular/core'; import { Observable } from 'rxjs/Observable'; import "rxjs/add/observable/timer"; import * as _ from 'lodash'; import { Howl } from 'howler'; import { PlayerlistService } from './playerlist.service'; import { BOOL_TYPE } from '@angular/compiler/src/output/output_ast'; @Component({ selector: 'app-root', templateUrl: './app.component.html', styleUrls: ['./app.component.scss'], providers: [PlayerlistService] }) export class AppComponent { title = 'app'; private playerList$: Observable<any>; private playerList: Array<any>; private sounds: Array<Howl>; private scheduler: Array<number>; private duration: Array<number> = []; private __current = 0; private __total = 0; private __currentTime = Date.now(); private __played = 0; private __end_time = Date.now(); private __upcoming_card = _.range(5); constructor(private playerListService: PlayerlistService) { this.playerList$ = playerListService.getList(); this.playerList$.subscribe(list => { this.sounds = _.map(list, sound => new Howl({ src: [sound.url], autoplay: false, loop: false, preload: false, html5: true, volume: 1 })); this.playerList = list; this.bootstrap(0); Observable.timer(0, 200).subscribe(() => { this.__currentTime = Date.now(); if (this.sounds[this.__current].state() == 'loaded') { let __pos = <number>this.sounds[this.__current].seek(); this.__played = __pos / this.duration[this.__current]; } }); }); } schedule() { this.__end_time = Date.now() + this.duration[this.__current] * 1000; this.scheduler = _.times(this.__current + 1, _.constant(Date.now())); for (let __index = this.__current + 1; __index < _.size(this.playerList); __index++) { this.scheduler.push(_.last(this.scheduler) + this.duration[__index - 1] * 1000); } } bootstrap(loading) { if (loading == _.size(this.playerList)) { this.sounds[0].once('load', () => { this.__total = _.size(this.playerList); this.schedule(); this.sounds[0].play(); this.sounds[0].once('end', () => this.onend()); }); this.sounds[0].load(); } else { this.sounds[loading].once('load', () => { this.duration.push(this.sounds[loading].duration()); this.sounds[loading].unload(); this.bootstrap(loading + 1); }); this.sounds[loading].load(); } } onend() { this.sounds[this.__current].stop(); this.sounds[this.__current].unload(); this.__current++; if (this.__current >= this.__total) { this.__current = 0; } this.sounds[this.__current].once('load', () => { this.schedule(); this.sounds[this.__current].play(); this.sounds[this.__current].once('end', () => this.onend()); }); this.sounds[this.__current].load(); } }
hist = [0 for i in range(100)] l = [] while True: try: hist[input() - 1] += 1 except: break; for k in filter(lambda x: max(hist) == x[1], enumerate(hist)): print k[0] + 1
print('Hello World') sauce = 'tomato' order_description = """ The client asked for a pizza with less cheese. Instead of cheese he would like to have more ham. """ if sauce.startswith('toma'): print("it's probably tomato...") else: print('not tomato') print(order_description)
def rate(e0, e1, n0, n1, nr, p): h0 = 0.5 ** (n0 * p) h1 = 0.5 ** (n1 * p) hr = 0.5 ** (nr * p) y = e0 * (h1 - hr) - e1 * (h0 - hr) return y
import { Route, RouterModule } from "@angular/router"; import { Lesson0Component } from "./lesson0/lesson0.component"; import { NgModule } from "@angular/core"; import { Lesson1Component } from "./lesson1/lesson1.component"; import { Lesson2Component } from "./lesson2/lesson2.component"; import { Lesson3Component } from "./lesson3/lesson3.component"; import { Lesson4Component } from "./lesson4/lesson4.component"; import { Lesson5Component } from "./lesson5/lesson5.component"; import { Lesson6Component } from "./lesson6/lesson6.component"; import { Lesson7Component } from "./lesson7/lesson7.component"; import { Lesson8Component } from "./lesson8/lesson8.component"; import { Lesson9Component } from "./lesson9/lesson9.component"; import { Lesson10Component } from "./lesson10/lesson10.component"; import { Lesson11Component } from "./lesson11/lesson11.component"; import { InstallComponent } from "./install/install.component"; import { Lesson12Component } from "./lesson12/lesson12.component"; import { Lesson13Component } from "./lesson13/lesson13.component"; import { Lesson14Component } from "./lesson14/lesson14.component"; import { ResourceComponent } from "./resource/resource.component"; import { Lesson15Component } from "./lesson15/lesson15.component"; import { Lesson16Component } from "./lesson16/lesson16.component"; import { Lesson17Component } from "./lesson17/lesson17.component"; import { Lesson18Component } from "./lesson18/lesson18.component"; import { Lesson19Component } from "./lesson19/lesson19.component"; import { IndexComponent } from "./index/index.component"; import { Lesson20Component } from "./lesson20/lesson20.component"; import { Lesson21Component } from "./lesson21/lesson21.component"; import { Lesson22Component } from "./lesson22/lesson22.component"; import { Lesson23Component } from "./lesson23/lesson23.component"; import { Lesson24Component } from "./lesson24/lesson24.component"; import { Lesson25Component } from "./lesson25/lesson25.component"; import { Lesson26Component } from "./lesson26/lesson26.component"; import { Lesson27Component } from "./lesson27/lesson27.component"; import { Lesson28Component } from "./lesson28/lesson28.component"; import { Lesson29Component } from "./lesson29/lesson29.component"; import { Lesson30Component } from "./lesson30/lesson30.component"; export const routes: Route[] = [ // { // path: 'index', // component: IndexComponent, // data: { title: 'Index' } // }, { path: "install", component: InstallComponent, data: { title: "Install" } }, { path: "lesson/0", component: Lesson0Component, data: { title: "Lesson 0" } }, { path: "lesson/1", component: Lesson1Component, data: { title: "Lesson 1" } }, { path: "lesson/2", component: Lesson2Component, data: { title: "Lesson 2" } }, { path: "lesson/3", component: Lesson3Component, data: { title: "Lesson 3" } }, { path: "lesson/4", component: Lesson4Component, data: { title: "Lesson 4" } }, { path: "lesson/5", component: Lesson5Component, data: { title: "Lesson 5" } }, { path: "lesson/6", component: Lesson6Component, data: { title: "Lesson 6" } }, { path: "lesson/7", component: Lesson7Component, data: { title: "Lesson 7" } }, { path: "lesson/8", component: Lesson8Component, data: { title: "Lesson 8" } }, { path: "lesson/9", component: Lesson9Component, data: { title: "Lesson 9" } }, { path: "lesson/10", component: Lesson10Component, data: { title: "Lesson 10" } }, { path: "lesson/11", component: Lesson11Component, data: { title: "Lesson 11" } }, { path: "lesson/12", component: Lesson12Component, data: 
{ title: "Lesson 12" } }, { path: "lesson/13", component: Lesson13Component, data: { title: "Lesson 13" } }, { path: "lesson/14", component: Lesson14Component, data: { title: "Lesson 14" } }, { path: "lesson/15", component: Lesson15Component, data: { title: "Lesson 15" } }, { path: "lesson/16", component: Lesson16Component, data: { title: "Lesson 16" } }, { path: "lesson/17", component: Lesson17Component, data: { title: "Lesson 17" } }, { path: "lesson/18", component: Lesson18Component, data: { title: "Lesson 18" } }, { path: "lesson/19", component: Lesson19Component, data: { title: "Lesson 19" } }, { path: "lesson/20", component: Lesson20Component, data: { title: "Lesson 20" } }, { path: "lesson/21", component: Lesson21Component, data: { title: "Lesson 21" } }, { path: "lesson/22", component: Lesson22Component, data: { title: "Lesson 22" } }, { path: "lesson/23", component: Lesson23Component, data: { title: "Lesson 23" } }, { path: "lesson/24", component: Lesson24Component, data: { title: "Lesson 24" } }, { path: "lesson/25", component: Lesson25Component, data: { title: "Lesson 25" } }, { path: "lesson/26", component: Lesson26Component, data: { title: "Lesson 26" } }, { path: "lesson/27", component: Lesson27Component, data: { title: "Lesson 27" } }, { path: "lesson/28", component: Lesson28Component, data: { title: "Lesson 28" } }, { path: "lesson/29", component: Lesson29Component, data: { title: "Lesson 29" } }, { path: "lesson/30", component: Lesson30Component, data: { title: "Lesson 30" } }, { path: "Resources", component: ResourceComponent, data: { title: "Resources" } }, { path: "", redirectTo: "install", pathMatch: "full" } ]; @NgModule({ imports: [RouterModule.forRoot(routes)] }) export class LessonRouteModule {}
/// Parse and validate the input according to WebAssembly 1.0 rules. Returns true if the supplied input is valid. pub fn validate<T: AsRef<[u8]>>(input: T) -> Result<(), String> { let mut err = FizzyErrorBox::new(); let ret = unsafe { sys::fizzy_validate( input.as_ref().as_ptr(), input.as_ref().len(), err.as_mut_ptr(), ) }; if ret { debug_assert!(err.code() == 0); Ok(()) } else { debug_assert!(err.code() != 0); Err(err.message()) } }
<filename>PARTE_3/EX023/index2.py from time import sleep def contador(i,f,p): if i < f and p == 0: while i != f: print(i,end=' ',flush=True) i += 1 sleep(0.1) print('FIM') else: if i > f and p == 0: while i != f: print(i,end=' ',flush=True) i -= 1 sleep(0.1) print('FIM') elif i < f: for c in range(i,f,p): print(c,end=' ',flush=True) sleep(0.1) print('FIM') elif i > f: for c in range(i,f,-p): print(c,end=' ',flush=True) sleep(0.1) print('FIM') #PROGRAMA PRINCIPAL contador(10,1,2)
def start_shuffler(port=DEFAULT_SHUFFLER_PORT, analyzer_uri='localhost:%d' % DEFAULT_ANALYZER_SERVICE_PORT, use_memstore=False, erase_db=True, db_dir=SHUFFLER_TMP_DB_DIR, config_file=SHUFFLER_DEMO_CONFIG_FILE, private_key_pem_file=DEFAULT_SHUFFLER_PRIVATE_KEY_PEM, use_tls=False, tls_cert_file=LOCALHOST_TLS_CERT_FILE, tls_key_file=LOCALHOST_TLS_KEY_FILE, verbose_count=0, wait=True): print cmd = [SHUFFLER_PATH, "-port", str(port), "-private_key_pem_file", private_key_pem_file, "-analyzer_uri", analyzer_uri, "-config_file", config_file, "-logtostderr"] if use_tls: cmd.append("-tls") cmd.append("-cert_file=%s"%tls_cert_file) cmd.append("-key_file=%s"%tls_key_file) if verbose_count > 0: cmd.append("-v=%d"%verbose_count) if use_memstore: cmd.append("-use_memstore") else: cmd = cmd + ["-db_dir", db_dir] if erase_db: print "Erasing Shuffler's LevelDB store at %s." % db_dir shutil.rmtree(db_dir, ignore_errors=True) print "Starting the shuffler..." print return execute_command(cmd, wait)
def startgame(self, event): if not self.switch: self.switch = True
/** * BigQuery Data Parser that parse the BiqQuery query result */ public final class BigQueryDataParser { private BigQueryDataParser() { } public static List<StructuredRecord> parse(TableResult result) { List<StructuredRecord> samples = new ArrayList<>(); com.google.cloud.bigquery.Schema schema = result.getSchema(); Schema cdapSchema = BigQueryUtil.getTableSchema(schema, null); FieldList fields = schema.getFields(); for (FieldValueList fieldValues : result.iterateAll()) { StructuredRecord record = getStructuredRecord(cdapSchema, fields, fieldValues); samples.add(record); } return samples; } public static StructuredRecord getStructuredRecord(Schema schema, FieldList fields, FieldValueList fieldValues) { StructuredRecord.Builder recordBuilder = StructuredRecord.builder(schema); for (Field field: fields) { String fieldName = field.getName(); FieldValue fieldValue = fieldValues.get(fieldName); Attribute attribute = fieldValue.getAttribute(); if (fieldValue.isNull()) { recordBuilder.set(fieldName, null); continue; } Schema.Field localField = schema.getField(fieldName); Schema localSchema; if (localField != null) { localSchema = localField.getSchema(); } else { continue; } FieldList subFields = field.getSubFields(); if (attribute == Attribute.REPEATED) { // Process each field of the array and then add it to the StructuredRecord List<Object> list = new ArrayList<>(); List<FieldValue> fieldValueList = fieldValue.getRepeatedValue(); for (FieldValue localValue : fieldValueList) { // If the field contains multiple fields then we have to process it recursively. if (localValue.getValue() instanceof FieldValueList) { FieldValueList localFieldValueListNoSchema = localValue.getRecordValue(); FieldValueList localFieldValueList = FieldValueList.of(localFieldValueListNoSchema, subFields); StructuredRecord componentRecord = getStructuredRecord(localSchema.getComponentSchema(), subFields, localFieldValueList); list.add(componentRecord); } else { list.add(convertValue(field, localValue)); } } recordBuilder.set(fieldName, list); } else if (attribute == Attribute.RECORD) { // If the field contains a Record then we need to process each field independently FieldValue localValue = fieldValue; Object value; if (localValue.getValue() instanceof FieldValueList) { // If one of the fields is a record FieldValueList localFieldValueListNoSchema = localValue.getRecordValue(); FieldValueList localFieldValueList = FieldValueList.of(localFieldValueListNoSchema, subFields); value = getStructuredRecord(localSchema.getNonNullable(), subFields, localFieldValueList); } else { value = convertValue(field, localValue); } addToRecordBuilder(recordBuilder, fieldName, value); } else { Object value = convertValue(field, fieldValue); addToRecordBuilder(recordBuilder, fieldName, value); } } StructuredRecord record = recordBuilder.build(); return record; } /** * Checks the type value to add and calls the correct method in the StructureRecord Builder. * @param recordBuilder The StructureRecord builder that we will be calling * @param fieldName The name of the field to which the value has to be added * @param value The value to add. 
*/ private static void addToRecordBuilder(StructuredRecord.Builder recordBuilder, String fieldName, Object value) { if (value instanceof ZonedDateTime) { recordBuilder.setTimestamp(fieldName, (ZonedDateTime) value); } else if (value instanceof LocalTime) { recordBuilder.setTime(fieldName, (LocalTime) value); } else if (value instanceof LocalDate) { recordBuilder.setDate(fieldName, (LocalDate) value); } else if (value instanceof LocalDateTime) { recordBuilder.setDateTime(fieldName, (LocalDateTime) value); } else if (value instanceof BigDecimal) { recordBuilder.setDecimal(fieldName, (BigDecimal) value); } else { recordBuilder.set(fieldName, value); } } /** * Convert BigQuery field value to CDAP field value * @param field BigQuery field * @param fieldValue BigQuery field value * @return the converted CDAP field value */ public static Object convertValue(Field field, FieldValue fieldValue) { LegacySQLTypeName type = field.getType(); StandardSQLTypeName standardType = type.getStandardType(); switch (standardType) { case TIME: return LocalTime.parse(fieldValue.getStringValue()); case DATE: return LocalDate.parse(fieldValue.getStringValue()); case TIMESTAMP: long tsMicroValue = fieldValue.getTimestampValue(); return getZonedDateTime(tsMicroValue); case NUMERIC: BigDecimal decimal = fieldValue.getNumericValue(); if (decimal.scale() < Numeric.SCALE) { // scale up the big decimal. this is because structured record expects scale to be exactly same as schema // Big Query supports maximum unscaled value up to Numeric.SCALE digits. so scaling up should // still be <= max precision decimal = decimal.setScale(Numeric.SCALE); } return decimal; case BIGNUMERIC: BigDecimal bigDecimal = fieldValue.getNumericValue(); if (bigDecimal.scale() < BigNumeric.SCALE) { // scale up the big decimal. this is because structured record expects scale to be exactly same as schema // Big Query Big Numeric supports maximum unscaled value up to BigNumeric.SCALE digits. // so scaling up should still be <= max precision bigDecimal = bigDecimal.setScale(BigNumeric.SCALE); } return bigDecimal; case DATETIME: return LocalDateTime.parse(fieldValue.getStringValue()); case STRING: return fieldValue.getStringValue(); case BOOL: return fieldValue.getBooleanValue(); case FLOAT64: return fieldValue.getDoubleValue(); case INT64: return fieldValue.getLongValue(); case BYTES: return fieldValue.getBytesValue(); default: throw new RuntimeException(String.format("BigQuery type %s is not supported.", standardType)); } } /** * Convert epoch microseconds to zoned date time * @param microTs the epoch microseconds * @return the converted zoned datte time */ public static ZonedDateTime getZonedDateTime(long microTs) { long tsInSeconds = TimeUnit.MICROSECONDS.toSeconds(microTs); long mod = TimeUnit.MICROSECONDS.convert(1, TimeUnit.SECONDS); int fraction = (int) (microTs % mod); Instant instant = Instant.ofEpochSecond(tsInSeconds, TimeUnit.MICROSECONDS.toNanos(fraction)); return ZonedDateTime.ofInstant(instant, ZoneId.ofOffset("UTC", ZoneOffset.UTC)); } }
/** Exposes a public API to perform operations in a ES instance. */ @Slf4j @NoArgsConstructor(access = AccessLevel.PRIVATE) public class EsIndex { /** * Creates an ES index. * * @param config configuration of the ES instance. * @param indexParams parameters to create the index * @return name of the index */ public static String createIndex(EsConfig config, IndexParams indexParams) { log.info("Creating index {}", indexParams.getIndexName()); try (EsClient esClient = EsClient.from(config)) { return EsService.createIndex(esClient, indexParams); } } /** * Creates an ES index if it doesn't exist another index with the same name. * * @param config configuration of the ES instance. * @param indexParams parameters to create the index * @return name of the index. Empty if the index already existed. */ public static Optional<String> createIndexIfNotExists(EsConfig config, IndexParams indexParams) { log.info("Creating index from params: {}", indexParams); try (EsClient esClient = EsClient.from(config)) { if (!EsService.existsIndex(esClient, indexParams.getIndexName())) { return Optional.of(EsService.createIndex(esClient, indexParams)); } } return Optional.empty(); } /** * Swaps an index in a aliases. * * <p>The index received will be the only index associated to the alias for the dataset after * performing this call. All the indexes that were associated to this alias before will be removed * from the ES instance. * * @param config configuration of the ES instance. * @param aliases aliases that will be modified. * @param index index to add to the aliases that will become the only index of the aliases for the * dataset. */ public static void swapIndexInAliases(EsConfig config, Set<String> aliases, String index) { Preconditions.checkArgument(aliases != null && !aliases.isEmpty(), "alias is required"); Preconditions.checkArgument(!Strings.isNullOrEmpty(index), "index is required"); swapIndexInAliases( config, aliases, index, Collections.emptySet(), Searching.getDefaultSearchSettings()); } /** * Swaps indexes in alias. It adds the given index to the alias and removes all the indexes that * match the dataset pattern plus some extra indexes that can be passed as parameter. * * @param config configuration of the ES instance. * @param aliases aliases that will be modified. * @param index index to add to the aliases that will become the only index of the alias for the * dataset. 
* @param extraIdxToRemove extra indexes to be removed from the aliases */ public static void swapIndexInAliases( EsConfig config, Set<String> aliases, String index, Set<String> extraIdxToRemove, Map<String, String> settings) { Objects.requireNonNull(aliases, "aliases are required"); Set<String> validAliases = aliases.stream().filter(alias -> !Strings.isNullOrEmpty(alias)).collect(Collectors.toSet()); Preconditions.checkArgument(!validAliases.isEmpty(), "aliases are required"); try (EsClient esClient = EsClient.from(config)) { Set<String> idxToAdd = new HashSet<>(); Set<String> idxToRemove = new HashSet<>(); // the index to add is optional Optional.ofNullable(index) .ifPresent( idx -> { idxToAdd.add(idx); // look for old indexes for this datasetId to remove them from the alias String datasetId = getDatasetIdFromIndex(idx); Optional.ofNullable( getIndexesByAliasAndIndexPattern( esClient, getDatasetIndexesPattern(datasetId), validAliases)) .ifPresent(idxToRemove::addAll); }); // add extra indexes to remove Optional.ofNullable(extraIdxToRemove).ifPresent(idxToRemove::addAll); log.info( "Removing indexes {} and adding index {} in aliases {}", idxToRemove, index, validAliases); // swap the indexes swapIndexes(esClient, validAliases, idxToAdd, idxToRemove); Optional.ofNullable(index) .ifPresent( idx -> { // change index settings to search settings updateIndexSettings(esClient, idx, settings); EsService.refreshIndex(esClient, idx); long count = EsService.countIndexDocuments(esClient, idx); log.info("Index {} added to alias {} with {} records", idx, validAliases, count); }); } } /** * Counts the number of documents of an index. * * @param config configuration of the ES instance. * @param index index to count the elements from. * @return number of documents of the index. */ public static long countDocuments(EsConfig config, String index) { Preconditions.checkArgument(!Strings.isNullOrEmpty(index), "index is required"); log.info("Counting documents from index {}", index); try (EsClient esClient = EsClient.from(config)) { return EsService.countIndexDocuments(esClient, index); } } /** * Connects to Elasticsearch instance and deletes records in an index by datasetId and returns the * indexes where the dataset was present. * * @param config configuration of the ES instance. * @param aliases aliases where we look for records of the dataset * @param datasetKey dataset whose records we are looking for * @param indexesToDelete filters the indexes whose records we want to delete, so we can ignore * some. E.g.: delete only from non-independent indexes. 
* @return datasets where we found records of this dataset */ public static Set<String> deleteRecordsByDatasetId( EsConfig config, String[] aliases, String datasetKey, Predicate<String> indexesToDelete, int timeoutSec, int attempts) { try (EsClient esClient = EsClient.from(config)) { // find indexes where the dataset is present Set<String> existingDatasetIndexes = findDatasetIndexesInAliases(esClient, aliases, datasetKey); if (existingDatasetIndexes == null || existingDatasetIndexes.isEmpty()) { return Collections.emptySet(); } // prepare parameters String query = String.format(DELETE_BY_DATASET_QUERY, datasetKey); // we only delete by query for the indexes specified String indexes = existingDatasetIndexes.stream().filter(indexesToDelete).collect(Collectors.joining(",")); if (!Strings.isNullOrEmpty(indexes)) { log.info("Deleting records from ES indexes {} with query {}", indexes, query); deleteRecordsByQueryAndWaitTillCompletion(esClient, indexes, query, timeoutSec, attempts); } return existingDatasetIndexes; } } @SneakyThrows private static void deleteRecordsByQueryAndWaitTillCompletion( EsClient esClient, String index, String query, int timeoutSec, int attempts) { Preconditions.checkArgument(!Strings.isNullOrEmpty(index), "index is required"); String taskId = EsService.deleteRecordsByQuery(esClient, index, query); DeleteByQueryTask task = EsService.getDeletedByQueryTask(esClient, taskId); while (!task.isCompleted() && attempts-- > 0) { TimeUnit.SECONDS.sleep(timeoutSec); task = EsService.getDeletedByQueryTask(esClient, taskId); } log.info("{} records deleted from ES index {}", task.getRecordsDeleted(), index); } /** * Connects to Elasticsearch instance and given index * * @param config configuration of the ES instance. * @param indexName name of the index to delete */ public static void deleteIndex(EsConfig config, String indexName) { try (EsClient esClient = EsClient.from(config)) { EsService.deleteIndex(esClient, indexName); } } /** * Finds the indexes in an alias where a given dataset is present. This method checks that the * aliases exist before querying ES. * * @param config configuration of the ES instance. * @param aliases name of the alias to search in. * @param datasetKey key of the dataset we are looking for. */ public static Set<String> findDatasetIndexesInAliases( EsConfig config, String[] aliases, String datasetKey) { try (EsClient esClient = EsClient.from(config)) { return findDatasetIndexesInAliases(esClient, aliases, datasetKey); } } private static Set<String> findDatasetIndexesInAliases( EsClient esClient, String[] aliases, String datasetKey) { Preconditions.checkArgument(aliases != null && aliases.length > 0, "aliases are required"); Preconditions.checkArgument(!Strings.isNullOrEmpty(datasetKey), "datasetKey is required"); // we check if the aliases exist, otherwise ES throws an error. String existingAlias = Arrays.stream(aliases) .filter(alias -> EsService.existsIndex(esClient, alias)) .collect(Collectors.joining(",")); return EsService.findDatasetIndexesInAlias(esClient, existingAlias, datasetKey); } private static String getDatasetIndexesPattern(String datasetId) { return datasetId + INDEX_SEPARATOR + "*"; } private static String getDatasetIdFromIndex(String index) { List<String> pieces = Arrays.asList(index.split(INDEX_SEPARATOR)); if (pieces.size() < 2) { log.error("Index {} doesn't follow the pattern \"{datasetId}_{attempt}\"", index); throw new IllegalArgumentException( "index has to follow the pattern \"{datasetId}_{attempt}\""); } return pieces.get(0); } }
/** * Forcefully shuts down this TransferManager instance - currently executing transfers will not * be allowed to finish. Callers should use this method when they either: * <ul> * <li>have already verified that their transfers have completed by checking each transfer's * state * <li>need to exit quickly and don't mind stopping transfers before they complete. * </ul> * <p> * Callers should also remember that uploaded parts from an interrupted upload may not always be * automatically cleaned up, but callers can use {@link #abortMultipartUploads(String, Date)} to * clean up any upload parts. * * @param shutDownCOSClient Whether to shut down the underlying Qcloud COS client. */ public void shutdownNow(boolean shutDownCOSClient) { if (shutDownThreadPools) { threadPool.shutdownNow(); timedThreadPool.shutdownNow(); } if (shutDownCOSClient) { if (cos instanceof COSClient) { ((COSClient) cos).shutdown(); } } }
/** * Created by yindp on 4/22/17. */ public class EmployeeDAOImpl implements EmployeeDAOInte { @Override public Employee load(String id) { Session session = null; Employee employee = null; try { session = MySessionFactory.openSession(); employee = session.load(Employee.class, id); } catch (Exception e) { e.printStackTrace(); } finally { MySessionFactory.close(session); } return employee; } }
def shutdown(self): self.display("Shutting down service") self.http_server.stop() io_loop = tornado.ioloop.IOLoop.instance() io_loop.add_timeout(time.time() + 2, io_loop.stop) self.display("httphq is down")
package org.jabref.logic.formatter.bibtexfields; import java.util.Objects; import java.util.regex.Pattern; import org.jabref.logic.l10n.Localization; import org.jabref.model.cleanup.Formatter; /** * Removes all hyphenated line breaks in the string. */ public class RemoveHyphenatedNewlinesFormatter implements Formatter { private static final Pattern HYPHENATED_WORDS = Pattern.compile("(-\r\n|-\n|-\r)"); @Override public String getName() { return Localization.lang("Remove hyphenated line breaks"); } @Override public String getKey() { return "remove_hyphenated_newlines"; } @Override public String format(String value) { Objects.requireNonNull(value); value = HYPHENATED_WORDS.matcher(value).replaceAll(""); return value.trim(); } @Override public String getDescription() { return Localization.lang("Removes all hyphenated line breaks in the field content."); } @Override public String getExampleInput() { return "Gimme shel-\nter"; } }
/** */ package logicalSpecification; import org.eclipse.emf.ecore.EObject; /** * <!-- begin-user-doc --> * A representation of the model object '<em><b>Operator</b></em>'. * <!-- end-user-doc --> * * <p> * The following features are supported: * </p> * <ul> * <li>{@link logicalSpecification.Operator#getFolspecification <em>Folspecification</em>}</li> * <li>{@link logicalSpecification.Operator#getForAlloperator <em>For Alloperator</em>}</li> * <li>{@link logicalSpecification.Operator#getNotOperator <em>Not Operator</em>}</li> * <li>{@link logicalSpecification.Operator#getAndOperator <em>And Operator</em>}</li> * <li>{@link logicalSpecification.Operator#getOrOperator <em>Or Operator</em>}</li> * <li>{@link logicalSpecification.Operator#getExistsOperator <em>Exists Operator</em>}</li> * </ul> * * @see logicalSpecification.LogicalSpecificationPackage#getOperator() * @model abstract="true" * @generated */ public interface Operator extends EObject { /** * Returns the value of the '<em><b>Folspecification</b></em>' container reference. * It is bidirectional and its opposite is '{@link logicalSpecification.FOLSpecification#getRootOperator <em>Root Operator</em>}'. * <!-- begin-user-doc --> * <p> * If the meaning of the '<em>Folspecification</em>' container reference isn't clear, * there really should be more of a description here... * </p> * <!-- end-user-doc --> * @return the value of the '<em>Folspecification</em>' container reference. * @see #setFolspecification(FOLSpecification) * @see logicalSpecification.LogicalSpecificationPackage#getOperator_Folspecification() * @see logicalSpecification.FOLSpecification#getRootOperator * @model opposite="rootOperator" transient="false" * @generated */ FOLSpecification getFolspecification(); /** * Sets the value of the '{@link logicalSpecification.Operator#getFolspecification <em>Folspecification</em>}' container reference. * <!-- begin-user-doc --> * <!-- end-user-doc --> * @param value the new value of the '<em>Folspecification</em>' container reference. * @see #getFolspecification() * @generated */ void setFolspecification(FOLSpecification value); /** * Returns the value of the '<em><b>For Alloperator</b></em>' container reference. * It is bidirectional and its opposite is '{@link logicalSpecification.ForAllOperator#getArgument <em>Argument</em>}'. * <!-- begin-user-doc --> * <p> * If the meaning of the '<em>For Alloperator</em>' container reference isn't clear, * there really should be more of a description here... * </p> * <!-- end-user-doc --> * @return the value of the '<em>For Alloperator</em>' container reference. * @see #setForAlloperator(ForAllOperator) * @see logicalSpecification.LogicalSpecificationPackage#getOperator_ForAlloperator() * @see logicalSpecification.ForAllOperator#getArgument * @model opposite="argument" transient="false" * @generated */ ForAllOperator getForAlloperator(); /** * Sets the value of the '{@link logicalSpecification.Operator#getForAlloperator <em>For Alloperator</em>}' container reference. * <!-- begin-user-doc --> * <!-- end-user-doc --> * @param value the new value of the '<em>For Alloperator</em>' container reference. * @see #getForAlloperator() * @generated */ void setForAlloperator(ForAllOperator value); /** * Returns the value of the '<em><b>Not Operator</b></em>' container reference. * It is bidirectional and its opposite is '{@link logicalSpecification.NotOperator#getArgument <em>Argument</em>}'. 
* <!-- begin-user-doc --> * <p> * If the meaning of the '<em>Not Operator</em>' container reference isn't clear, * there really should be more of a description here... * </p> * <!-- end-user-doc --> * @return the value of the '<em>Not Operator</em>' container reference. * @see #setNotOperator(NotOperator) * @see logicalSpecification.LogicalSpecificationPackage#getOperator_NotOperator() * @see logicalSpecification.NotOperator#getArgument * @model opposite="argument" transient="false" * @generated */ NotOperator getNotOperator(); /** * Sets the value of the '{@link logicalSpecification.Operator#getNotOperator <em>Not Operator</em>}' container reference. * <!-- begin-user-doc --> * <!-- end-user-doc --> * @param value the new value of the '<em>Not Operator</em>' container reference. * @see #getNotOperator() * @generated */ void setNotOperator(NotOperator value); /** * Returns the value of the '<em><b>And Operator</b></em>' container reference. * It is bidirectional and its opposite is '{@link logicalSpecification.AndOperator#getArguments <em>Arguments</em>}'. * <!-- begin-user-doc --> * <p> * If the meaning of the '<em>And Operator</em>' container reference isn't clear, * there really should be more of a description here... * </p> * <!-- end-user-doc --> * @return the value of the '<em>And Operator</em>' container reference. * @see #setAndOperator(AndOperator) * @see logicalSpecification.LogicalSpecificationPackage#getOperator_AndOperator() * @see logicalSpecification.AndOperator#getArguments * @model opposite="arguments" transient="false" * @generated */ AndOperator getAndOperator(); /** * Sets the value of the '{@link logicalSpecification.Operator#getAndOperator <em>And Operator</em>}' container reference. * <!-- begin-user-doc --> * <!-- end-user-doc --> * @param value the new value of the '<em>And Operator</em>' container reference. * @see #getAndOperator() * @generated */ void setAndOperator(AndOperator value); /** * Returns the value of the '<em><b>Or Operator</b></em>' container reference. * It is bidirectional and its opposite is '{@link logicalSpecification.OrOperator#getArguments <em>Arguments</em>}'. * <!-- begin-user-doc --> * <p> * If the meaning of the '<em>Or Operator</em>' container reference isn't clear, * there really should be more of a description here... * </p> * <!-- end-user-doc --> * @return the value of the '<em>Or Operator</em>' container reference. * @see #setOrOperator(OrOperator) * @see logicalSpecification.LogicalSpecificationPackage#getOperator_OrOperator() * @see logicalSpecification.OrOperator#getArguments * @model opposite="arguments" transient="false" * @generated */ OrOperator getOrOperator(); /** * Sets the value of the '{@link logicalSpecification.Operator#getOrOperator <em>Or Operator</em>}' container reference. * <!-- begin-user-doc --> * <!-- end-user-doc --> * @param value the new value of the '<em>Or Operator</em>' container reference. * @see #getOrOperator() * @generated */ void setOrOperator(OrOperator value); /** * Returns the value of the '<em><b>Exists Operator</b></em>' container reference. * It is bidirectional and its opposite is '{@link logicalSpecification.ExistsOperator#getArgument <em>Argument</em>}'. * <!-- begin-user-doc --> * <p> * If the meaning of the '<em>Exists Operator</em>' container reference isn't clear, * there really should be more of a description here... * </p> * <!-- end-user-doc --> * @return the value of the '<em>Exists Operator</em>' container reference. 
* @see #setExistsOperator(ExistsOperator) * @see logicalSpecification.LogicalSpecificationPackage#getOperator_ExistsOperator() * @see logicalSpecification.ExistsOperator#getArgument * @model opposite="argument" transient="false" * @generated */ ExistsOperator getExistsOperator(); /** * Sets the value of the '{@link logicalSpecification.Operator#getExistsOperator <em>Exists Operator</em>}' container reference. * <!-- begin-user-doc --> * <!-- end-user-doc --> * @param value the new value of the '<em>Exists Operator</em>' container reference. * @see #getExistsOperator() * @generated */ void setExistsOperator(ExistsOperator value); boolean guarantees(Operator op2); } // Operator
#[macro_use] mod macros; mod decoder; mod dns; mod domain_name; mod error; mod helpers; mod question; mod rr; #[cfg(test)] mod tests; use decoder::Decoder; pub use error::DecodeError; pub type DecodeResult<T> = std::result::Result<T, DecodeError>;
<reponame>GinoCardillo-OST/Visual-OO-Debugger import { EscapedString } from './escapedString'; import { StructureId } from './structureId'; /** * Fields represent either parameters and local variables (both on stack * frames) or attributes on a Java object. */ export interface Field { parentId: StructureId; type?: string; name: string; value: EscapedString; }
def save(self, force_insert=False, force_update=False, *args, **kwargs): reload = kwargs.pop('no_reload', False) ret = super().save(force_insert, force_update, *args, **kwargs) if not reload: if self.active is False and self.__org_active is True: plugin_reg.reload_plugins() elif self.active is True and self.__org_active is False: plugin_reg.reload_plugins() return ret
<filename>simulation_experiment/utils.py import numpy as np import matplotlib.pyplot as plt import torch def eval_cevae(CEVAE, data_te, loss_dict): y_te = torch.Tensor(data_te[:, 0]) a_te = torch.Tensor(data_te[:, 1]) x_te = torch.Tensor(data_te[:, 3:]) _, loss_dict = CEVAE.forward(x_te, a_te, y_te, loss_dict, evaluate=True) torch.save(CEVAE.state_dict(), 'output/cevae_model') return loss_dict def train_cevae(CEVAE, args, optimizer, data_tr, data_te, loss_dict): print(' - training CEVAE') # Training loop y = torch.Tensor(data_tr[:, 0]) a = torch.Tensor(data_tr[:, 1]) x = torch.Tensor(data_tr[:, 3:]) loss_dict = eval_cevae(CEVAE, data_te, loss_dict) for epoch_i in range(args.epochs): iter_per_epoch = np.floor(y.shape[0]/args.batchSize).astype(int) for iter_i in range(iter_per_epoch): batch_idx = np.random.choice(list(range(x.shape[0])), args.batchSize, replace=False) # select data for batch x_batch = torch.Tensor(x[batch_idx]) a_batch = torch.Tensor(a[batch_idx]) y_batch = torch.Tensor(y[batch_idx]) neg_lowerbound, loss_dict = CEVAE.forward(x_batch, a_batch, y_batch, loss_dict) optimizer.zero_grad() # Calculate gradients neg_lowerbound.backward() # Update step optimizer.step() loss_dict = eval_cevae(CEVAE, data_te, loss_dict) return loss_dict def plot_data(data, reconstruct=False): idx_a0 = np.where(data[:, 1] == 0) idx_a1 = np.where(data[:, 1] == 1) # Plot overview x distribution plt.figure(figsize=(5, 4), dpi=100) # plt.suptitle('Distribution x, conditioned on a') # loop through 5 x variables num_x = int(data.shape[1]-3) for i in range(num_x): plt.subplot(num_x, 1, i + 1) if i == (num_x-1): plt.xlabel('x value') plt.ylabel('Hist x' + str(i + 1)) plt.xlim(-5, 5) label0, label1 = 'a0', 'a1' if reconstruct: label0 = 'recon a0' label1 = 'recon a1' plt.hist(data[idx_a0, 3 + i][0], histtype='step', label=label0, bins=50, color='firebrick') plt.hist(data[idx_a1, 3 + i][0], histtype='step', label=label1, bins=50, color='navy') plt.legend() plt.tight_layout() if reconstruct: plt.savefig('output/data_x_recon') else: plt.savefig('output/data_x') plt.show() # plot y distributions plt.figure(figsize=(5, 4), dpi=100) # plt.suptitle('Distribution Y, conditioned on a') plt.subplot(2, 1, 1) plt.hist(data[idx_a0, 0][0], label='a0', histtype='step', alpha=0.6, color='firebrick') plt.legend() plt.subplot(2, 1, 2) plt.hist(data[idx_a1, 0][0], label='a1', histtype='step', alpha=0.6, color='navy') plt.legend() if reconstruct: plt.savefig('output/data_y_recon') else: plt.savefig('output/data_y') plt.show() plt.close() def plot_loss(loss_dict): plt.figure(figsize=(16, 8), dpi=100) subplot = 1 # divide by 2, because every kind has a training an evaluation entry which should be plotted as one max_subplot = len(loss_dict)/2 for key, value in loss_dict.items(): if key[:5] != 'eval_': plt.subplot(max_subplot, 1, subplot) # calculate how many training points there are per evaluation point train_test_obs_ratio = np.floor(len(value)/len(loss_dict['eval_' + key])) test_iter = [train_test_obs_ratio * i for i in range(len(loss_dict['eval_' + key]))] plt.plot(value, color='blue', label='train') plt.plot(test_iter, loss_dict['eval_' + key], color='red', label='test') plt.xlabel('iteration') plt.ylabel('log probability') plt.legend() plt.title(key) subplot += 1 plt.tight_layout() plt.savefig('output/loss') # plt.show()
/** * Decoder for dictionary. Uses main decoder cause any other type can be nested * Created by iurii.dziuban on 02/06/2017. */ public class DictionaryDecoder implements Decoder<Map<ByteString, Object>> { private ByteStringDecoder byteStringDecoder = new ByteStringDecoder(); private final Decoder<Object> compositeDecoder; public DictionaryDecoder(Decoder<Object> compositeDecoder) { this.compositeDecoder = compositeDecoder; } @Override public Map<ByteString, Object> decode(Iterator<Character> iterator, String s) { Map<ByteString, Object> map = new TreeMap<>(); char character; while(iterator.hasNext() && (character = iterator.next()) != 'e') { ByteString key = byteStringDecoder.decode(iterator, "" + character); Object value = compositeDecoder.decode(iterator, ""); if (value == null) { throw new IllegalStateException("Unexpected end of Bencode data"); } map.put(key, value); } return map; } }
//----------------------------------------------------------------------------- // Purpose: Loads a face chunk from the VMF file. // Input : pFile - Chunk file being loaded. // Output : Returns ChunkFile_Ok or an error if there was a parsing error. //----------------------------------------------------------------------------- ChunkFileResult_t CMapFace::LoadVMF(CChunkFile *pFile) { SignalUpdate(EVTYPE_FACE_CHANGED); CChunkHandlerMap Handlers; Handlers.AddHandler("dispinfo", (ChunkHandler_t) LoadDispInfoCallback, this); LoadFace_t LoadFace; memset(&LoadFace, 0, sizeof(LoadFace)); LoadFace.pFace = this; pFile->PushHandlers(&Handlers); ChunkFileResult_t eResult = pFile->ReadChunk((KeyHandler_t) LoadKeyCallback, &LoadFace); pFile->PopHandlers(); if (eResult == ChunkFile_Ok) { CalcPlane(); SetTexture(LoadFace.szTexName); } return (eResult); }
<reponame>jonathan-s/djangocms-navigation from django.contrib.auth.models import Permission from django.shortcuts import reverse from django.test import TestCase from django.test.client import RequestFactory from cms.middleware.toolbar import ToolbarMiddleware from cms.toolbar.items import SideframeItem from cms.toolbar.toolbar import CMSToolbar from djangocms_navigation.cms_toolbars import NavigationToolbar from djangocms_navigation.test_utils import factories class TestCMSToolbars(TestCase): def _get_page_request(self, page, user): request = RequestFactory().get("/") request.session = {} request.user = user request.current_page = page mid = ToolbarMiddleware() mid.process_request(request) if hasattr(request, "toolbar"): request.toolbar.populate() return request def _get_toolbar(self, content_obj, user=None, **kwargs): """Helper method to set up the toolbar """ if not user: user = factories.UserFactory(is_staff=True) request = self._get_page_request( page=content_obj.page if content_obj else None, user=user ) cms_toolbar = CMSToolbar(request) toolbar = NavigationToolbar( cms_toolbar.request, toolbar=cms_toolbar, is_current_app=True, app_path="/" ) toolbar.toolbar.set_object(content_obj) return toolbar def _find_menu_item(self, name, toolbar): for left_item in toolbar.get_left_items(): for menu_item in left_item.items: try: if menu_item.name == name: return menu_item # Break item has no attribute `name` except AttributeError: pass def test_navigation_menu_added_to_admin_menu(self): user = factories.UserFactory() user.user_permissions.add( Permission.objects.get( content_type__app_label="djangocms_navigation", codename="change_menucontent", ) ) page_content = factories.PageContentFactory() toolbar = self._get_toolbar(page_content, preview_mode=True, user=user) toolbar.populate() toolbar.post_template_populate() cms_toolbar = toolbar.toolbar navigation_menu_item = self._find_menu_item("Navigation...", cms_toolbar) url = reverse("admin:djangocms_navigation_menucontent_changelist") self.assertIsNotNone(navigation_menu_item) self.assertIsInstance(navigation_menu_item, SideframeItem) self.assertEqual(navigation_menu_item.url, url) def test_navigation_menu_not_added_to_admin_menu_if_user_doesnt_have_permissions(self): user = factories.UserFactory() page_content = factories.PageContentFactory() toolbar = self._get_toolbar(page_content, preview_mode=True, user=user) toolbar.populate() toolbar.post_template_populate() cms_toolbar = toolbar.toolbar navigation_menu_item = self._find_menu_item("Navigation...", cms_toolbar) self.assertIsNone(navigation_menu_item)
Writing is a tough gig. Doing it well? Even harder. And while I’ve been doing it for a while now, I’ll be the first to admit that my ability to string together words is far from polished and still a work in progress. As a self-taught writer, I suppose that comes with the territory. Journalism, I’ve come to find, is the toughest type of writing. Not only do you have to be a good writer, but you also need to do it quickly to stay relevant and — perhaps the more difficult task — ideally be the first to do so. The idea behind that being, once you’re able to regularly deliver breaking news first, you’re the someone who everyone will want to turn to in the future. Eyeballs equal dollars, after all. Being able to pull all of that off is unquestionably a difficult task, and that’s one of the many reasons you see so few people making a living from the job. I’m not there… yet, at least. That said, when I see a good piece of journalism, I’m not afraid to admit it. Take for example the latest piece from the Sporting News‘ Brian Straus, a bombshell of exclusive reporting on the fractured relationship between the US Men’s National Team and head coach Jürgen Klinsmann. It revealed insider information directly from the mouths of the players and those close to them, confirming the fears that many fans and pundits alike had about Klinsmann’s spell in charge. In short, confidence is lacking both in the German’s tactical acumen and his ability to take the team forward. Clearly, Brian Straus put in long hours at the office to put this thing together. Getting people with exclusive information — such as those with intimate knowledge of the inner workings and thoughts of a national team — is a really tough task. Nobody wants to be made as the one who leaked information, the nark or appear as if they’re sabotaging the team. Straus, however, got twenty-two people to talk, and a monumentally larger effort was necessary to get that many people spilling their guts. But despite all of the work that clearly went into it, what I found odd about the entire piece was its timing. With just a few days before a pair of crucial World Cup Qualifiers, the shockwaves from Straus’ article were felt in every corner of the American soccer community. What exactly did Straus hope would come from publishing this at the time that he did? Was he trying to pressure Klinsmann, US Soccer president Sunil Gulati or the players into some sort of responsive action? Because if you ask me, the only type of pressure this places on them is the wrong kind. It’s the kind that breeds malcontent, finger-pointing and accusations, especially since everyone is now aware of the elephant in the room. Was he hoping that it would just get the conversation started, so that they could all sit down and talk it out, and then conduct a big bro hug at the end? That seems a pretty unlikely end result. And what of us supporters? We were already a fan base on edge thanks to a nervy semi-final round and an opening loss in the Hexagonal to Honduras. And with a number of marquee players missing through either injury, self-imposed exile, or simply being left off the roster, a vocal contingent of supporters were already electronically-screaming for Klinsmann’s head on a plate. At a time where the USMNT desperately needs our support, Straus’ article did nothing but pour gas on the fire and then fan the flames of discontent. 
So given it’s timing, if nothing but negatives could really be taken away from publishing an article, why would a respected journalist like Brian Straus drop a bomb he knew would disrupt things in an already tumultuous national team environment? Couldn’t he have just dropped it a week from now when they wouldn’t have near the negative impact? Well, it all goes back to eyeballs. Imagine he had waited to publish it after the qualifiers: what if the national team managed two positive results? It could have rendered his arguments — and the month of work needed to produce them — completely moot. Nobody wants to read an article about dissension within the USMNT camp if everything appears to be running smoothly. You see, readers want confirmations that all their fears are true while they’re experiencing that fear. The nervousness and uncertainty ahead of the Costa Rica and Mexico matches is the perfect environment to drop a bomb of drastic consequences; delaying it’s release ran the risk of completing all of that work for nothing. In short, Strauss exploded the National Team’s internal problems at an extremely poor time just to get more eyes on his story. And if you recall, more eyeballs means more advertising dollars in his and Sporting News‘ pockets. Drama sells. Totally worth it, right? Personally, I think the hypothesized argument for their timing is a poor one. He could have easily spun all of those quotes and thoughts into a positive piece had the national team gotten the results they so desperately need. Negative critiques and worries could have been used as introductions into how Klinsmann and his staff solved those problems. Sure it wouldn’t have been as influential, but it still would have made for a good article. And if the national team stunk it up, he could have left the article as is and run it after the qualifiers, and it still would have been impactful. That said, maybe I’ve got it all wrong. After all, journalism and writing are a tough gig, and Straus’ intentions could have been entirely different. However, if things go poorly over the next two games for the US National Team, I hope Straus is prepared to accept that his writing might have played a small part in bringing down the ship… even if that’s not what was intended. Advertisements
package org.firstinspires.ftc.teamcode.teamcode.prometheus.lib; public class Angle { private double radians; public Angle(){ radians = 0; } public Angle(Angle angle){ radians = angle.rad(); } public Angle(double radians){ this.radians = radians; } public Angle setRadians(double radians){ this.radians = radians; return this; } public Angle setDegrees(double degrees){ this.radians = Math.toRadians(degrees); return this; } public double getRadians(){ return radians; } public double getDegrees(){ return Math.toDegrees(radians); } //Shorter versions public Angle rad(double radians){ return setRadians(radians); } public double rad(){ return getRadians(); } public Angle deg(double degrees){ return setDegrees(degrees); } public double deg(){ return getDegrees(); } public Angle copy(){ return new Angle(radians); } public Angle negative(){ return new Angle(-radians); } }
/* * Driver for Digigram VXpocket soundcards * * lowlevel routines for VXpocket soundcards * * Copyright (c) 2002 by <NAME> <<EMAIL>> * * This program is free software; you can redistribute it and/or modify * it under the terms of the GNU General Public License as published by * the Free Software Foundation; either version 2 of the License, or * (at your option) any later version. * * This program is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the * GNU General Public License for more details. * * You should have received a copy of the GNU General Public License * along with this program; if not, write to the Free Software * Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA */ #include <sound/driver.h> #include <linux/delay.h> #include <linux/device.h> #include <linux/firmware.h> #include <sound/core.h> #include <asm/io.h> #include "vxpocket.h" static int vxp_reg_offset[VX_REG_MAX] = { [VX_ICR] = 0x00, // ICR [VX_CVR] = 0x01, // CVR [VX_ISR] = 0x02, // ISR [VX_IVR] = 0x03, // IVR [VX_RXH] = 0x05, // RXH [VX_RXM] = 0x06, // RXM [VX_RXL] = 0x07, // RXL [VX_DMA] = 0x04, // DMA [VX_CDSP] = 0x08, // CDSP [VX_LOFREQ] = 0x09, // LFREQ [VX_HIFREQ] = 0x0a, // HFREQ [VX_DATA] = 0x0b, // DATA [VX_MICRO] = 0x0c, // MICRO [VX_DIALOG] = 0x0d, // DIALOG [VX_CSUER] = 0x0e, // CSUER [VX_RUER] = 0x0f, // RUER }; static inline unsigned long vxp_reg_addr(struct vx_core *_chip, int reg) { struct snd_vxpocket *chip = (struct snd_vxpocket *)_chip; return chip->port + vxp_reg_offset[reg]; } /* * snd_vx_inb - read a byte from the register * @offset: register offset */ static unsigned char vxp_inb(struct vx_core *chip, int offset) { return inb(vxp_reg_addr(chip, offset)); } /* * snd_vx_outb - write a byte on the register * @offset: the register offset * @val: the value to write */ static void vxp_outb(struct vx_core *chip, int offset, unsigned char val) { outb(val, vxp_reg_addr(chip, offset)); } /* * redefine macros to call directly */ #undef vx_inb #define vx_inb(chip,reg) vxp_inb((struct vx_core *)(chip), VX_##reg) #undef vx_outb #define vx_outb(chip,reg,val) vxp_outb((struct vx_core *)(chip), VX_##reg,val) /* * vx_check_magic - check the magic word on xilinx * * returns zero if a magic word is detected, or a negative error code. */ static int vx_check_magic(struct vx_core *chip) { unsigned long end_time = jiffies + HZ / 5; int c; do { c = vx_inb(chip, CDSP); if (c == CDSP_MAGIC) return 0; msleep(10); } while (time_after_eq(end_time, jiffies)); snd_printk(KERN_ERR "cannot find xilinx magic word (%x)\n", c); return -EIO; } /* * vx_reset_dsp - reset the DSP */ #define XX_DSP_RESET_WAIT_TIME 2 /* ms */ static void vxp_reset_dsp(struct vx_core *_chip) { struct snd_vxpocket *chip = (struct snd_vxpocket *)_chip; /* set the reset dsp bit to 1 */ vx_outb(chip, CDSP, chip->regCDSP | VXP_CDSP_DSP_RESET_MASK); vx_inb(chip, CDSP); mdelay(XX_DSP_RESET_WAIT_TIME); /* reset the bit */ chip->regCDSP &= ~VXP_CDSP_DSP_RESET_MASK; vx_outb(chip, CDSP, chip->regCDSP); vx_inb(chip, CDSP); mdelay(XX_DSP_RESET_WAIT_TIME); } /* * reset codec bit */ static void vxp_reset_codec(struct vx_core *_chip) { struct snd_vxpocket *chip = (struct snd_vxpocket *)_chip; /* Set the reset CODEC bit to 1. */ vx_outb(chip, CDSP, chip->regCDSP | VXP_CDSP_CODEC_RESET_MASK); vx_inb(chip, CDSP); msleep(10); /* Set the reset CODEC bit to 0. 
*/ chip->regCDSP &= ~VXP_CDSP_CODEC_RESET_MASK; vx_outb(chip, CDSP, chip->regCDSP); vx_inb(chip, CDSP); msleep(1); } /* * vx_load_xilinx_binary - load the xilinx binary image * the binary image is the binary array converted from the bitstream file. */ static int vxp_load_xilinx_binary(struct vx_core *_chip, const struct firmware *fw) { struct snd_vxpocket *chip = (struct snd_vxpocket *)_chip; unsigned int i; int c; int regCSUER, regRUER; unsigned char *image; unsigned char data; /* Switch to programmation mode */ chip->regDIALOG |= VXP_DLG_XILINX_REPROG_MASK; vx_outb(chip, DIALOG, chip->regDIALOG); /* Save register CSUER and RUER */ regCSUER = vx_inb(chip, CSUER); regRUER = vx_inb(chip, RUER); /* reset HF0 and HF1 */ vx_outb(chip, ICR, 0); /* Wait for answer HF2 equal to 1 */ snd_printdd(KERN_DEBUG "check ISR_HF2\n"); if (vx_check_isr(_chip, ISR_HF2, ISR_HF2, 20) < 0) goto _error; /* set HF1 for loading xilinx binary */ vx_outb(chip, ICR, ICR_HF1); image = fw->data; for (i = 0; i < fw->size; i++, image++) { data = *image; if (vx_wait_isr_bit(_chip, ISR_TX_EMPTY) < 0) goto _error; vx_outb(chip, TXL, data); /* wait for reading */ if (vx_wait_for_rx_full(_chip) < 0) goto _error; c = vx_inb(chip, RXL); if (c != (int)data) snd_printk(KERN_ERR "vxpocket: load xilinx mismatch at %d: 0x%x != 0x%x\n", i, c, (int)data); } /* reset HF1 */ vx_outb(chip, ICR, 0); /* wait for HF3 */ if (vx_check_isr(_chip, ISR_HF3, ISR_HF3, 20) < 0) goto _error; /* read the number of bytes received */ if (vx_wait_for_rx_full(_chip) < 0) goto _error; c = (int)vx_inb(chip, RXH) << 16; c |= (int)vx_inb(chip, RXM) << 8; c |= vx_inb(chip, RXL); snd_printdd(KERN_DEBUG "xilinx: dsp size received 0x%x, orig 0x%x\n", c, fw->size); vx_outb(chip, ICR, ICR_HF0); /* TEMPO 250ms : wait until Xilinx is downloaded */ msleep(300); /* test magical word */ if (vx_check_magic(_chip) < 0) goto _error; /* Restore register 0x0E and 0x0F (thus replacing COR and FCSR) */ vx_outb(chip, CSUER, regCSUER); vx_outb(chip, RUER, regRUER); /* Reset the Xilinx's signal enabling IO access */ chip->regDIALOG |= VXP_DLG_XILINX_REPROG_MASK; vx_outb(chip, DIALOG, chip->regDIALOG); vx_inb(chip, DIALOG); msleep(10); chip->regDIALOG &= ~VXP_DLG_XILINX_REPROG_MASK; vx_outb(chip, DIALOG, chip->regDIALOG); vx_inb(chip, DIALOG); /* Reset of the Codec */ vxp_reset_codec(_chip); vx_reset_dsp(_chip); return 0; _error: vx_outb(chip, CSUER, regCSUER); vx_outb(chip, RUER, regRUER); chip->regDIALOG &= ~VXP_DLG_XILINX_REPROG_MASK; vx_outb(chip, DIALOG, chip->regDIALOG); return -EIO; } /* * vxp_load_dsp - load_dsp callback */ static int vxp_load_dsp(struct vx_core *vx, int index, const struct firmware *fw) { int err; switch (index) { case 0: /* xilinx boot */ if ((err = vx_check_magic(vx)) < 0) return err; if ((err = snd_vx_load_boot_image(vx, fw)) < 0) return err; return 0; case 1: /* xilinx image */ return vxp_load_xilinx_binary(vx, fw); case 2: /* DSP boot */ return snd_vx_dsp_boot(vx, fw); case 3: /* DSP image */ return snd_vx_dsp_load(vx, fw); default: snd_BUG(); return -EINVAL; } } /* * vx_test_and_ack - test and acknowledge interrupt * * called from irq hander, too * * spinlock held! */ static int vxp_test_and_ack(struct vx_core *_chip) { struct snd_vxpocket *chip = (struct snd_vxpocket *)_chip; /* not booted yet? */ if (! (_chip->chip_status & VX_STAT_XILINX_LOADED)) return -ENXIO; if (! 
(vx_inb(chip, DIALOG) & VXP_DLG_MEMIRQ_MASK)) return -EIO; /* ok, interrupts generated, now ack it */ /* set ACQUIT bit up and down */ vx_outb(chip, DIALOG, chip->regDIALOG | VXP_DLG_ACK_MEMIRQ_MASK); /* useless read just to spend some time and maintain * the ACQUIT signal up for a while ( a bus cycle ) */ vx_inb(chip, DIALOG); vx_outb(chip, DIALOG, chip->regDIALOG & ~VXP_DLG_ACK_MEMIRQ_MASK); return 0; } /* * vx_validate_irq - enable/disable IRQ */ static void vxp_validate_irq(struct vx_core *_chip, int enable) { struct snd_vxpocket *chip = (struct snd_vxpocket *)_chip; /* Set the interrupt enable bit to 1 in CDSP register */ if (enable) chip->regCDSP |= VXP_CDSP_VALID_IRQ_MASK; else chip->regCDSP &= ~VXP_CDSP_VALID_IRQ_MASK; vx_outb(chip, CDSP, chip->regCDSP); } /* * vx_setup_pseudo_dma - set up the pseudo dma read/write mode. * @do_write: 0 = read, 1 = set up for DMA write */ static void vx_setup_pseudo_dma(struct vx_core *_chip, int do_write) { struct snd_vxpocket *chip = (struct snd_vxpocket *)_chip; /* Interrupt mode and HREQ pin enabled for host transmit / receive data transfers */ vx_outb(chip, ICR, do_write ? ICR_TREQ : ICR_RREQ); /* Reset the pseudo-dma register */ vx_inb(chip, ISR); vx_outb(chip, ISR, 0); /* Select DMA in read/write transfer mode and in 16-bit accesses */ chip->regDIALOG |= VXP_DLG_DMA16_SEL_MASK; chip->regDIALOG |= do_write ? VXP_DLG_DMAWRITE_SEL_MASK : VXP_DLG_DMAREAD_SEL_MASK; vx_outb(chip, DIALOG, chip->regDIALOG); } /* * vx_release_pseudo_dma - disable the pseudo-DMA mode */ static void vx_release_pseudo_dma(struct vx_core *_chip) { struct snd_vxpocket *chip = (struct snd_vxpocket *)_chip; /* Disable DMA and 16-bit accesses */ chip->regDIALOG &= ~(VXP_DLG_DMAWRITE_SEL_MASK| VXP_DLG_DMAREAD_SEL_MASK| VXP_DLG_DMA16_SEL_MASK); vx_outb(chip, DIALOG, chip->regDIALOG); /* HREQ pin disabled. */ vx_outb(chip, ICR, 0); } /* * vx_pseudo_dma_write - write bulk data on pseudo-DMA mode * @count: data length to transfer in bytes * * data size must be aligned to 6 bytes to ensure the 24bit alignment on DSP. * NB: call with a certain lock! */ static void vxp_dma_write(struct vx_core *chip, struct snd_pcm_runtime *runtime, struct vx_pipe *pipe, int count) { long port = vxp_reg_addr(chip, VX_DMA); int offset = pipe->hw_ptr; unsigned short *addr = (unsigned short *)(runtime->dma_area + offset); vx_setup_pseudo_dma(chip, 1); if (offset + count > pipe->buffer_bytes) { int length = pipe->buffer_bytes - offset; count -= length; length >>= 1; /* in 16bit words */ /* Transfer using pseudo-dma. */ while (length-- > 0) { outw(cpu_to_le16(*addr), port); addr++; } addr = (unsigned short *)runtime->dma_area; pipe->hw_ptr = 0; } pipe->hw_ptr += count; count >>= 1; /* in 16bit words */ /* Transfer using pseudo-dma. */ while (count-- > 0) { outw(cpu_to_le16(*addr), port); addr++; } vx_release_pseudo_dma(chip); } /* * vx_pseudo_dma_read - read bulk data on pseudo DMA mode * @offset: buffer offset in bytes * @count: data length to transfer in bytes * * the read length must be aligned to 6 bytes, as well as write. * NB: call with a certain lock! 
*/ static void vxp_dma_read(struct vx_core *chip, struct snd_pcm_runtime *runtime, struct vx_pipe *pipe, int count) { struct snd_vxpocket *pchip = (struct snd_vxpocket *)chip; long port = vxp_reg_addr(chip, VX_DMA); int offset = pipe->hw_ptr; unsigned short *addr = (unsigned short *)(runtime->dma_area + offset); snd_assert(count % 2 == 0, return); vx_setup_pseudo_dma(chip, 0); if (offset + count > pipe->buffer_bytes) { int length = pipe->buffer_bytes - offset; count -= length; length >>= 1; /* in 16bit words */ /* Transfer using pseudo-dma. */ while (length-- > 0) *addr++ = le16_to_cpu(inw(port)); addr = (unsigned short *)runtime->dma_area; pipe->hw_ptr = 0; } pipe->hw_ptr += count; count >>= 1; /* in 16bit words */ /* Transfer using pseudo-dma. */ while (count-- > 1) *addr++ = le16_to_cpu(inw(port)); /* Disable DMA */ pchip->regDIALOG &= ~VXP_DLG_DMAREAD_SEL_MASK; vx_outb(chip, DIALOG, pchip->regDIALOG); /* Read the last word (16 bits) */ *addr = le16_to_cpu(inw(port)); /* Disable 16-bit accesses */ pchip->regDIALOG &= ~VXP_DLG_DMA16_SEL_MASK; vx_outb(chip, DIALOG, pchip->regDIALOG); /* HREQ pin disabled. */ vx_outb(chip, ICR, 0); } /* * write a codec data (24bit) */ static void vxp_write_codec_reg(struct vx_core *chip, int codec, unsigned int data) { int i; /* Activate access to the corresponding codec register */ if (! codec) vx_inb(chip, LOFREQ); else vx_inb(chip, CODEC2); /* We have to send 24 bits (3 x 8 bits). Start with most signif. Bit */ for (i = 0; i < 24; i++, data <<= 1) vx_outb(chip, DATA, ((data & 0x800000) ? VX_DATA_CODEC_MASK : 0)); /* Terminate access to codec registers */ vx_inb(chip, HIFREQ); } /* * vx_set_mic_boost - set mic boost level (on vxp440 only) * @boost: 0 = 20dB, 1 = +38dB */ void vx_set_mic_boost(struct vx_core *chip, int boost) { struct snd_vxpocket *pchip = (struct snd_vxpocket *)chip; unsigned long flags; if (chip->chip_status & VX_STAT_IS_STALE) return; spin_lock_irqsave(&chip->lock, flags); if (pchip->regCDSP & P24_CDSP_MICS_SEL_MASK) { if (boost) { /* boost: 38 dB */ pchip->regCDSP &= ~P24_CDSP_MIC20_SEL_MASK; pchip->regCDSP |= P24_CDSP_MIC38_SEL_MASK; } else { /* minimum value: 20 dB */ pchip->regCDSP |= P24_CDSP_MIC20_SEL_MASK; pchip->regCDSP &= ~P24_CDSP_MIC38_SEL_MASK; } vx_outb(chip, CDSP, pchip->regCDSP); } spin_unlock_irqrestore(&chip->lock, flags); } /* * remap the linear value (0-8) to the actual value (0-15) */ static int vx_compute_mic_level(int level) { switch (level) { case 5: level = 6 ; break; case 6: level = 8 ; break; case 7: level = 11; break; case 8: level = 15; break; default: break ; } return level; } /* * vx_set_mic_level - set mic level (on vxpocket only) * @level: the mic level = 0 - 8 (max) */ void vx_set_mic_level(struct vx_core *chip, int level) { struct snd_vxpocket *pchip = (struct snd_vxpocket *)chip; unsigned long flags; if (chip->chip_status & VX_STAT_IS_STALE) return; spin_lock_irqsave(&chip->lock, flags); if (pchip->regCDSP & VXP_CDSP_MIC_SEL_MASK) { level = vx_compute_mic_level(level); vx_outb(chip, MICRO, level); } spin_unlock_irqrestore(&chip->lock, flags); } /* * change the input audio source */ static void vxp_change_audio_source(struct vx_core *_chip, int src) { struct snd_vxpocket *chip = (struct snd_vxpocket *)_chip; switch (src) { case VX_AUDIO_SRC_DIGITAL: chip->regCDSP |= VXP_CDSP_DATAIN_SEL_MASK; vx_outb(chip, CDSP, chip->regCDSP); break; case VX_AUDIO_SRC_LINE: chip->regCDSP &= ~VXP_CDSP_DATAIN_SEL_MASK; if (_chip->type == VX_TYPE_VXP440) chip->regCDSP &= ~P24_CDSP_MICS_SEL_MASK; else chip->regCDSP &= 
~VXP_CDSP_MIC_SEL_MASK; vx_outb(chip, CDSP, chip->regCDSP); break; case VX_AUDIO_SRC_MIC: chip->regCDSP &= ~VXP_CDSP_DATAIN_SEL_MASK; /* reset mic levels */ if (_chip->type == VX_TYPE_VXP440) { chip->regCDSP &= ~P24_CDSP_MICS_SEL_MASK; if (chip->mic_level) chip->regCDSP |= P24_CDSP_MIC38_SEL_MASK; else chip->regCDSP |= P24_CDSP_MIC20_SEL_MASK; vx_outb(chip, CDSP, chip->regCDSP); } else { chip->regCDSP |= VXP_CDSP_MIC_SEL_MASK; vx_outb(chip, CDSP, chip->regCDSP); vx_outb(chip, MICRO, vx_compute_mic_level(chip->mic_level)); } break; } } /* * change the clock source * source = INTERNAL_QUARTZ or UER_SYNC */ static void vxp_set_clock_source(struct vx_core *_chip, int source) { struct snd_vxpocket *chip = (struct snd_vxpocket *)_chip; if (source == INTERNAL_QUARTZ) chip->regCDSP &= ~VXP_CDSP_CLOCKIN_SEL_MASK; else chip->regCDSP |= VXP_CDSP_CLOCKIN_SEL_MASK; vx_outb(chip, CDSP, chip->regCDSP); } /* * reset the board */ static void vxp_reset_board(struct vx_core *_chip, int cold_reset) { struct snd_vxpocket *chip = (struct snd_vxpocket *)_chip; chip->regCDSP = 0; chip->regDIALOG = 0; } /* * callbacks */ /* exported */ struct snd_vx_ops snd_vxpocket_ops = { .in8 = vxp_inb, .out8 = vxp_outb, .test_and_ack = vxp_test_and_ack, .validate_irq = vxp_validate_irq, .write_codec = vxp_write_codec_reg, .reset_codec = vxp_reset_codec, .change_audio_source = vxp_change_audio_source, .set_clock_source = vxp_set_clock_source, .load_dsp = vxp_load_dsp, .add_controls = vxp_add_mic_controls, .reset_dsp = vxp_reset_dsp, .reset_board = vxp_reset_board, .dma_write = vxp_dma_write, .dma_read = vxp_dma_read, };
Phaeochromocytoma: current concepts The discovery of novel mutations in genes encoding succinate dehydrogenase subunits has revealed that familial phaeochromocytomas are much more common than previously thought. Genetic screening should be offered to patients with apparently sporadic phaeochromocytomas and their first‐degree relatives. An increasing proportion of phaeochromocytomas present preclinically on genetic testing or as “incidentalomas” on abdominal imaging, rather than with classic symptoms and signs. Clinical suspicion should prompt measurement of plasma levels of free metanephrine or 24‐hour urinary catecholamine and metanephrine levels, followed, if positive, by tumour localisation studies. With appropriate perioperative care, surgical management of phaeochromocytomas is safe and effective. Most tumours can be removed laparoscopically.
import {NgModule} from '@angular/core'; import {CommonModule} from '@angular/common'; import {OrdersService} from './services/orders.service'; import {TaxService} from './services/tax.service'; import {OrderModule} from './pages/order-page/order.module'; import {NewOrderModule} from './pages/new-order-page/new-order.module'; @NgModule({ declarations: [], imports: [ CommonModule, OrderModule, NewOrderModule ], providers: [OrdersService, TaxService], bootstrap: [] }) export class OrdersModule { }
/** * Undertakes linking to a {@link ActivityProcedureModel}. * * @param procedureIndex {@link ActivityProcedureNextModel} index. */ private void doLinkToProcedure(int procedureIndex) { ActivityProcedureNextModel procedureNext = this.model.getActivityProcedures().get(procedureIndex).getNext(); ActivityProcedureModel procedure = this.model.getActivityProcedures().get(1); Change<ActivityProcedureNextToActivityProcedureModel> change = this.operations .linkProcedureNextToProcedure(procedureNext, procedure); this.assertChange(change, null, "Link Procedure Next to Procedure", true); }
/* AOJ 1137 Numeral System 2015/5/21 */ #include<stdio.h> char output_str[10] = {'\0'}; int mcxi_num(char *input_str) { int i, base, num, sum = 0; for(i = 0; input_str[i] != '\0'; i++) { if(input_str[i] > 60) { switch(input_str[i]) { case 'm': sum += 1000; break; case 'c': sum += 100; break; case 'x': sum += 10; break; case 'i': sum += 1; break; } } else { num = input_str[i] - '0'; switch(input_str[++i]) { case 'm': base = 1000; break; case 'c': base = 100; break; case 'x': base = 10; break; case 'i': base = 1; break; } sum += num * base; } } return sum; } void num_mcxi(int num) { int i = 0, j, div, rest = num; char inp_ch; for(j = 1000; j > 0; j = j / 10) { div = rest / j; rest = rest % j; if(div == 0) { continue; } else if(div == 1) { switch(j) { case 1000: output_str[i++] = 'm'; break; case 100: output_str[i++] = 'c'; break; case 10: output_str[i++] = 'x'; break; case 1: output_str[i++] = 'i'; break; } } else { switch(j) { case 1000: output_str[i++] = div + '0'; output_str[i++] = 'm'; break; case 100: output_str[i++] = div + '0'; output_str[i++] = 'c'; break; case 10: output_str[i++] = div + '0'; output_str[i++] = 'x'; break; case 1: output_str[i++] = div + '0'; output_str[i++] = 'i'; break; } } } output_str[i] = '\0'; } int main(void) { int num, i, sum = 0; char first[10], second[10]; scanf("%d", &num); for(i = 0; i < num; i++) { scanf("%s %s", first, second); // scanf("%d", &sum); // num_mcxi(sum); // printf("%d\n", mcxi_num(first) + mcxi_num(second)); num_mcxi(mcxi_num(first) + mcxi_num(second)); printf("%s\n", output_str); } return 0; }
// NewJingoCommand creates the root jingo command func NewJingoCommand() *cobra.Command { cmds := &cobra.Command{} return cmds }
// Do runs exechook.command, implements Hook.Do func (c *Exechook) Do(ctx context.Context, hash string) error { ctx, cancel := context.WithTimeout(ctx, c.timeout) defer cancel() worktreePath := filepath.Join(c.gitRoot, hash) c.logger.V(0).Info("running exechook", "command", c.command, "timeout", c.timeout) _, err := c.cmdrunner.Run(ctx, worktreePath, c.command, c.args...) return err }
package io.picos.webhookee.incoming.bitbucket; import com.fasterxml.jackson.annotation.JsonProperty; /** * @auther dz */ public class BitBucketIssue { private String id; private String component; private String title; private BitBucketContent content; private String priority; private String assignee; private String state; private String type; private String milestone; private String version; @JsonProperty("created_on") private String createdOn; @JsonProperty("updated_on") private String updatedOn; private BitBucketLink links; public String getId() { return id; } public void setId(String id) { this.id = id; } public String getComponent() { return component; } public void setComponent(String component) { this.component = component; } public String getTitle() { return title; } public void setTitle(String title) { this.title = title; } public BitBucketContent getContent() { return content; } public void setContent(BitBucketContent content) { this.content = content; } public String getPriority() { return priority; } public void setPriority(String priority) { this.priority = priority; } public String getAssignee() { return assignee; } public void setAssignee(String assignee) { this.assignee = assignee; } public String getState() { return state; } public void setState(String state) { this.state = state; } public String getType() { return type; } public void setType(String type) { this.type = type; } public String getMilestone() { return milestone; } public void setMilestone(String milestone) { this.milestone = milestone; } public String getVersion() { return version; } public void setVersion(String version) { this.version = version; } public String getCreatedOn() { return createdOn; } public void setCreatedOn(String createdOn) { this.createdOn = createdOn; } public String getUpdatedOn() { return updatedOn; } public void setUpdatedOn(String updatedOn) { this.updatedOn = updatedOn; } public BitBucketLink getLinks() { return links; } public void setLinks(BitBucketLink links) { this.links = links; } }
declare var jest import { LeveledLogMethod } from 'winston' export default () => ({ info: jest.fn() as LeveledLogMethod, warn: jest.fn() as LeveledLogMethod, error: jest.fn() as LeveledLogMethod, debug: jest.fn() as LeveledLogMethod, })
Mama Ru and Mother Monster have come to werk! Lady Gaga is helping RuPaul kick off the ninth season of RuPaul’s Drag Race and PEOPLE has the exclusive first look at the singer’s cameo. Get push notifications with news, features and more. The Emmy-winning show is making the jump from Logo to VH1, so the network will be throwing a live viewing party on Friday nights. Wendy Williams will host the bash — dubbed Fierce Fridays — and RuPaul’s Drag Race judge Ross Mathews will join in to dish on the queens ahead of the show’s premiere. The viewing party kicks off March 24 at 5:30 PM on VH1 during the Pitch Perfect broadcast. Source: VH1 In the premiere, 13 queens who are vying for the title of America’s Next Drag Superstar are introduced to Gaga for the first time. The shock and awe of Mother Monster’s presence leaves many in tears, with one contestant sharing an emotional story of how impactful Gaga has been in their lives. “I have always really admired the craftsmanship that goes into what you all do,” this year’s Super Bowl halftime show performer, 30, tells the competitors. “Drag for me has been an opportunity to leave myself when I didn’t want to be me and felt so completely out of place when I was in high school. Drag has been a part of my life for the longest time. It’s just really an honor for me to be here, to be with you.” Not since ABC’s 2013 Thanksgiving television special, Lady Gaga and the Muppets Holiday Spectacular, have RuPaul and Gaga been together on screen. FROM COINAGE: This Is How Long You’ll Have to Work to Make as Much Money as Rihanna’s Hit ‘Work’ “There’s an ease and familiarity that is apparent when you watch her,” RuPaul, who recently revealed he married his longtime partner, told Entertainment Weekly of Gaga’s Drag Race stint. “We had a fabulous time with Gaga. She was in heaven because this is all so much a part of her wheelhouse and what she does with imagery, costumes, and the way she presents herself.” Other celebrities making appearances this season include Lisa Kudrow, Cheyenne Jackson, Meghan Trainor, Kesha, The B-52s, Naya Rivera, Andie MacDowell, Tori Spelling, Jennie Garth, Denis O’Hare, Noah Galvin, Todrick Hall, Tamar Braxton, Lisa Robertson, Joan Smalls, Candis Cayne, Fortune Feimster, and Jeffrey Bowyer-Chapman. RuPaul’s Drag Race premieres Friday, March 24, at 8 p.m. ET on VH1.
/** * Tests of the methods on an {@link FDBRecordStore} around estimating the size of a store. * * <p> * Note that as the guarantees of the "estimate size" APIs are relatively weak, these tests mostly assert that the * estimate happened without error, though they can occasionally make more sophisticated assertions like "this estimate * should be less than this other estimate". * </p> */ @Tag(Tags.RequiresFDB) public class FDBRecordStoreEstimateSizeTest extends FDBRecordStoreTestBase { private static final Logger LOGGER = LoggerFactory.getLogger(FDBRecordStoreEstimateSizeTest.class); @Test public void estimatedSize() throws Exception { populateStore(1_000, 1_000_000L); try (FDBRecordContext context = openContext()) { openSimpleRecordStore(context); long estimatedStoreSize = estimateStoreSize(recordStore); long exactStoreSize = getExactStoreSize(recordStore); LOGGER.info(KeyValueLogMessage.of("calculated estimated store size", "estimated", estimatedStoreSize, "exact", exactStoreSize)); long estimatedRecordsSize = estimateRecordsSize(recordStore); long exactRecordsSize = getExactRecordsSize(recordStore); LOGGER.info(KeyValueLogMessage.of("calculated estimated records size", "estimated", estimatedRecordsSize, "exact", exactRecordsSize)); assertThat("estimated record size should be less than estimated store size", estimatedRecordsSize, lessThanOrEqualTo(estimatedStoreSize)); commit(context); // commit as that logs the store timer stats } } @Disabled("too expensive but useful for smoke-checking the performance on large stores") @Test public void estimateLargeStore() throws Exception { populateStore(200_000, 1_000_000_000L); try (FDBRecordContext context = openContext()) { openSimpleRecordStore(context); long estimatedStoreSize = estimateStoreSize(recordStore); long estimatedRecordsSize = estimateRecordsSize(recordStore); LOGGER.info(KeyValueLogMessage.of("calculated estimated record size on 1 GB store", "estimated_store_size", estimatedStoreSize, "estimated_records_size", estimatedRecordsSize)); assertThat(estimatedRecordsSize, lessThanOrEqualTo(estimatedStoreSize)); commit(context); } } @Test public void estimateEmptyStore() throws Exception { try (FDBRecordContext context = openContext()) { openSimpleRecordStore(context); long estimatedStoreSize = estimateStoreSize(recordStore); long estimatedRecordsSize = estimateRecordsSize(recordStore); LOGGER.info(KeyValueLogMessage.of("estimated size of empty store", "estimated_store_size", estimatedStoreSize, "estimated_record_size", estimatedRecordsSize)); assertThat(estimatedRecordsSize, lessThanOrEqualTo(estimatedStoreSize)); commit(context); } } @Test public void estimateSingleRecord() throws Exception { populateStore(1, 5_000L); try (FDBRecordContext context = openContext()) { openSimpleRecordStore(context); long estimatedStoreSize = estimateStoreSize(recordStore); long estimatedRecordsSize = estimateRecordsSize(recordStore); LOGGER.info(KeyValueLogMessage.of("estimated size of store with a single record", "estimated_store_size", estimatedStoreSize, "estimated_record_size", estimatedRecordsSize)); assertThat(estimatedRecordsSize, lessThanOrEqualTo(estimatedStoreSize)); commit(context); } } @Test public void estimateInTwoRanges() throws Exception { final int recordCount = 500; populateStore(recordCount, recordCount * 5_000); try (FDBRecordContext context = openContext()) { openSimpleRecordStore(context); long estimatedRecordSize = estimateRecordsSize(recordStore); final Tuple halfWayTuple = Tuple.from(recordCount / 2); final TupleRange firstHalf = 
new TupleRange(null, halfWayTuple, EndpointType.TREE_START, EndpointType.RANGE_EXCLUSIVE); long estimatedFirstHalf = estimateRecordsSize(recordStore, firstHalf); final TupleRange secondHalf = new TupleRange(halfWayTuple, null, EndpointType.RANGE_INCLUSIVE, EndpointType.TREE_END); long estimatedSecondHalf = estimateRecordsSize(recordStore, secondHalf); LOGGER.info(KeyValueLogMessage.of("estimated both halves of records", "estimated_records_size", estimatedRecordSize, "estimated_first_half_size", estimatedFirstHalf, "estimated_second_half_size", estimatedSecondHalf)); assertEquals(estimatedFirstHalf + estimatedSecondHalf, estimatedRecordSize, "expected first half size (" + estimatedFirstHalf + ") and second half size (" + estimatedSecondHalf + ") to match full size"); commit(context); } } private static long estimateStoreSize(@Nonnull FDBRecordStore store) { return store.getRecordContext().asyncToSync(FDBStoreTimer.Waits.WAIT_ESTIMATE_SIZE, store.estimateStoreSizeAsync()); } private static long estimateRecordsSize(@Nonnull FDBRecordStore store) { return store.getRecordContext().asyncToSync(FDBStoreTimer.Waits.WAIT_ESTIMATE_SIZE, store.estimateRecordsSizeAsync()); } private static long estimateRecordsSize(@Nonnull FDBRecordStore store, @Nonnull TupleRange range) { return store.getRecordContext().asyncToSync(FDBStoreTimer.Waits.WAIT_ESTIMATE_SIZE, store.estimateRecordsSizeAsync(range)); } private static long getExactStoreSize(@Nonnull FDBRecordStore store) { SizeStatisticsCollectorCursor statsCursor = SizeStatisticsCollectorCursor.ofStore(store, store.getContext(), ScanProperties.FORWARD_SCAN, null); return getTotalSize(statsCursor); } private static long getExactRecordsSize(@Nonnull FDBRecordStore store) { SizeStatisticsCollectorCursor statsCursor = SizeStatisticsCollectorCursor.ofRecords(store, store.getContext(), ScanProperties.FORWARD_SCAN, null); return getTotalSize(statsCursor); } private static long getTotalSize(@Nonnull SizeStatisticsCollectorCursor statsCursor) { final RecordCursorResult<SizeStatisticsCollectorCursor.SizeStatisticsResults> result = statsCursor.getNext(); assertTrue(result.hasNext()); return result.get().getTotalSize(); } private void populateStore(int recordCount, long byteCount) throws Exception { int bytesPerRecord = (int)(byteCount / recordCount); final String data = StringUtils.repeat('x', bytesPerRecord); int currentRecord = 0; while (currentRecord < recordCount) { try (FDBRecordContext context = openContext()) { openSimpleRecordStore(context); long transactionSize; do { final Message record = TestRecords1Proto.MySimpleRecord.newBuilder() .setRecNo(currentRecord) .setStrValueIndexed(data) .build(); recordStore.saveRecord(record); transactionSize = context.asyncToSync(FDBStoreTimer.Waits.WAIT_APPROXIMATE_TRANSACTION_SIZE, context.getApproximateTransactionSize()); currentRecord++; } while (currentRecord < recordCount && transactionSize < 1_000_000); commit(context); } } } }
/** * Called when a successful notification is shown. * @param info Pending notification information to be handled. * @param notificationId ID of the notification. */ @VisibleForTesting void onSuccessNotificationShown( final PendingNotificationInfo notificationInfo, final int notificationId) { ThreadUtils.postOnUiThread(new Runnable() { @Override public void run() { DownloadManagerService.getDownloadManagerService( mApplicationContext).onSuccessNotificationShown( notificationInfo.downloadInfo, notificationInfo.canResolve, notificationId, notificationInfo.systemDownloadId); } }); }
Joanna Leigh at home in the Jamaica Plain section of Boston. Leigh suffered a traumatic brain injury from the marathon bombing and is seeking more money from the One Fund. (Matthew Cavanaugh/For The Washington Post) Joanna Leigh describes her life in black and white, before and after. Before the Boston Marathon bombing, she says, she had “just embarked on a really beautiful future” with a new doctoral degree in international development and a career as a consultant. Today, she says, she can’t work or drive and often gets lost, sometimes on her own block. Her vision is blurry, her hearing is diminished and her ears ring constantly. She struggles to cook dinner, do her laundry, fill out a form. Mostly, she sleeps. The cause of her difficulties, according to the physician who examined her, was a traumatic brain injury on April 15. But because Leigh, 39, walked home that day after she was knocked unconscious by the second bomb and never went to a hospital, she received just $8,000 from the One Fund charity for survivors. She said her medical and other expenses have reached $70,000. She is applying for disability payments and food stamps. One Fund payouts to everyone except 16 amputees and the families of the four people who were killed were based on the number of nights spent in the hospital. A single night was worth $125,000; 32 nights qualified victims for $948,000. The 143 people who were treated as outpatients received $8,000 each. In coming days, Leigh and four other attack survivors will petition the One Fund to develop a new plan for distributing the millions of dollars in donations the charity has received since the first payout. They are seeking a formula that takes into account injuries that were slow to reveal themselves. “Rough justice” is what lawyer Kenneth Feinberg has called the system he devised to distribute nearly $61 million, and some have complained about it. J.P. and Paul Norden, brothers from Stoneham, Mass., who lost their right legs in the attack, told The Washington Post last month that their $1.2 million payouts, while generous, will fall short of covering their needs because of the high cost of prostheses and other medical care. They think amputees should receive more. The four other people on the petition besides Leigh have long-term problems with their hearing, said Jeff Stern, Leigh’s attorney. One, Scott Weisberg, a physician in Birmingham, Ala., and one of the few runners injured, said in a brief interview that he has been forced to cut back his hours and is concerned about his ability to do his job. The three others have not been publicly identified. Camille Biros, Feinberg’s deputy, said the plan always anticipated the possibility of additional payments if money is available and they are deemed warranted. “She is taking the right steps. She is going back to the One Fund . . . and now the board of directors can make another determination,” Biros said. Mike Sheehan, the One Fund’s treasurer, said the charity will speak with all survivors before deciding how to spend the more than $10 million contributed since the initial payout. “We have to be careful. We have to be thorough. And we’ll try as best we can to be fair,” he said. Symptoms of traumatic brain injury can take time to emerge. Inflammation of brain tissue can progress and the damage to delicate neural pathways may not be evident immediately, said Erin Bigler, a professor of neuroscience at Brigham Young University in Utah. 
When there are no visible signs of trauma, laymen can have difficulty understanding why some people suffer disabling problems, he said. But even with increasingly sophisticated tests, the diagnosis is still controversial. “Since the majority who sustain a concussion spontaneously improve and return to baseline-level function, some believe that concussion is benign, and if symptoms persist, it reflects a psychiatric manifestation,” he said. Bigler said cognitive and emotional problems can persist after concussions. Douglas Sheff, president of the Massachusetts Bar Association and a personal-injury lawyer who volunteered to help bombing victims, said the “One Fund is doing the same thing that, frankly, insurance companies and, frankly, doctors are still doing to this day” — not treating brain injuries seriously because they sometimes have no obvious physical symptoms. ‘Not what I was prepared for’ Leigh, who was at the marathon to watch a friend run, was perhaps 10 feet from the second bomb when it detonated, trapped in a thick crowd that may have shielded her from the shrapnel that tore into so many around her. She had been making her way toward the first explosion hoping to help. She knows CPR and has had disaster response training. She said she awoke against a barricade. Debris was fluttering in the air and blood and flesh were everywhere, she said. “It’s just heat and it’s little pin dots,” she said. “I don’t feel my feet and I don’t feel myself going through the air. And I wake up and I’m up against the railing, sitting against the railing, and people were trampling my feet.” An older man, covered in blood, his arm flayed open, staggered toward her. She helped him around a corner, where she thought he’d be safer. Two men brought over another victim. “His foot and ankle were all bloody. The top half of his foot came off in my hand. It was in his sneaker. . . . I vaccinate babies in East Africa. This was not what I was prepared for,” Leigh said. She helped the two injured men into ambulances. Leigh said she tried to find her way home in Boston’s Jamaica Plain section, more than three miles away. Trains were shut down and cabs were not allowed near the bombing site, so eventually, she said, she walked. She remembers very little of the trip. With a doctor’s appointment scheduled for two days later, Leigh decided not to go to a hospital. She spent the first night agitated and the next day dizzy and vomiting when she wasn’t asleep, typical signs of a concussion. But a bomb scare canceled her appointment and the lockdown and manhunt for the suspects on April 19 canceled another. She wasn’t examined until April 24, nine days after the bombing. Since then, she has seen a neurologist, an eye doctor, a hearing specialist, a psychotherapist and a dermatologist for damage to her skin. She has had increasingly sophisticated tests on her brain. She takes a variety of medications to treat headaches and prevent seizures. “The majority of this patient’s severe cognitive, somatic, mood and sleep symptoms that have completely disrupted her life are due to her post-concussion syndrome,” Robert Cantu, Leigh’s doctor and a national expert on the subject, wrote after he evaluated her. “As a result of these symptoms Dr. Leigh is currently and for the foreseeable future disabled from the kind of complex consulting work she has done before the bomb blast injury was sustained.” Leigh also has post-traumatic stress disorder, Cantu wrote. 
Leigh’s health insurance has paid about $40,000 of her $70,000 in bills, the One Fund gave her $8,000 and a state victims’ fund has contributed $5,000. She is down to $1,200, living on credit cards and money from selling her possessions, she said. She is filing for monthly Social Security disability payments of $473 and $30.40 in food stamps — far less than the $2,300 she owes just for rent and student loans each month, she said. “My brain isn’t prepared to handle a lot of tasks anymore,” Leigh said.“If I try to do a lot of things, it has, like, meltdowns. “I have no money. I have nobody to help me do things. I’m always worried about bills.” Leigh has been rebuffed twice by the One Fund, the first time when she met with Feinberg before the money was distributed and later by another official after the payments were made. Stern, her attorney, is among those who have asked Massachusetts Attorney General Martha Coakley to intervene, and he has stepped up his efforts to get her more money. “Donors to the One Fund, to my knowledge, never intended to have their contributions limited in such a way as to leave victims like my client so severely undercompensated,” he wrote to the charity July 17. Leigh said she will soon give up her health insurance and go on the state program for the poor, which may sharply reduce available treatment. “I don’t ever regret running to help people,” she said, adding, “I’m really saddened that my society hasn’t rallied to help me.”
/*** Given an array nums with n objects colored red, white, or blue, sort them in-place so that objects of the same color are adjacent, with the colors in the order red, white, and blue. Here, we will use the integers 0, 1, and 2 to represent the color red, white, and blue respectively. Follow up: Could you solve this problem without using the library's sort function? Could you come up with a one-pass algorithm using only O(1) constant space? Example 1: Input: nums = [2,0,2,1,1,0] Output: [0,0,1,1,2,2] Example 2: Input: nums = [2,0,1] Output: [0,1,2] Example 3: Input: nums = [0] Output: [0] Example 4: Input: nums = [1] Output: [1] Constraints: n == nums.length 1 <= n <= 300 nums[i] is 0, 1, or 2. */ /** Do not return anything, modify nums in-place instead. */ export function sortColors(nums: number[]): void { let nextZero = 0; // tries not be zero let fetcher = 0; // fetch and deliver to nextZero or nextTwo let nextTwo = nums.length - 1; // tries not to be two while(fetcher <= nextTwo) { while(nums[nextZero] === 0 && fetcher <= nextTwo) { nextZero++; if(fetcher < nextZero) { fetcher = nextZero; } } while(nums[nextTwo] === 2 && fetcher <= nextTwo) { nextTwo--; } if(fetcher <= nextTwo) { if (nums[nextZero] > nums[nextTwo]) { let temp = nums[nextZero]; nums[nextZero] = nums[nextTwo]; nums[nextTwo] = temp; } if (nums[nextZero] == nums[nextTwo]) { if (nums[fetcher] === 0) { nums[nextZero] = 0; nextZero++; nums[fetcher] = 1; fetcher++; } else if (nums[fetcher] === 2) { nums[nextTwo] = 2; nextTwo--; nums[fetcher] = 1; fetcher++; } else { fetcher++; } } } } }
// Copyright 2009-2021 NTESS. Under the terms // of Contract DE-NA0003525 with NTESS, the U.S. // Government retains certain rights in this software. // // Copyright (c) 2009-2021, NTESS // All rights reserved. // // This file is part of the SST software package. For license // information, see the LICENSE file in the top level directory of the // distribution. #include "sst_config.h" #include "sst/core/env/envconfig.h" #include <cstdio> #include <cstdlib> #include <iostream> #include <map> #include <set> #include <string> #include <sys/file.h> #include <vector> // SST::Core::Environment::EnvironmentConfigGroup(std::string gName) { // groupName(gName); //} std::string SST::Core::Environment::EnvironmentConfigGroup::getName() const { return groupName; } std::set<std::string> SST::Core::Environment::EnvironmentConfigGroup::getKeys() const { std::set<std::string> retKeys; for ( auto mapItr = params.begin(); mapItr != params.end(); mapItr++ ) { retKeys.insert(mapItr->first); } return retKeys; } std::string SST::Core::Environment::EnvironmentConfigGroup::getValue(const std::string& key) { return (params.find(key) == params.end()) ? "" : params[key]; } void SST::Core::Environment::EnvironmentConfigGroup::setValue(const std::string& key, const std::string& value) { auto paramsItr = params.find(key); if ( paramsItr != params.end() ) { params.erase(paramsItr); } params.insert(std::pair<std::string, std::string>(key, value)); } void SST::Core::Environment::EnvironmentConfigGroup::print() { printf("# Group: %s ", groupName.c_str()); int remainingLen = 70 - groupName.size(); if ( remainingLen < 0 ) { remainingLen = 0; } for ( ; remainingLen >= 0; remainingLen-- ) { printf("-"); } printf("\n"); for ( auto paramsItr = params.begin(); paramsItr != params.end(); paramsItr++ ) { printf("%s=%s\n", paramsItr->first.c_str(), paramsItr->second.c_str()); } } void SST::Core::Environment::EnvironmentConfigGroup::writeTo(FILE* outFile) { fprintf(outFile, "\n[%s]\n", groupName.c_str()); for ( auto paramsItr = params.begin(); paramsItr != params.end(); paramsItr++ ) { fprintf(outFile, "%s=%s\n", paramsItr->first.c_str(), paramsItr->second.c_str()); } } SST::Core::Environment::EnvironmentConfiguration::EnvironmentConfiguration() {} SST::Core::Environment::EnvironmentConfiguration::~EnvironmentConfiguration() { // Delete all the groups we have created for ( auto groupItr = groups.begin(); groupItr != groups.end(); groupItr++ ) { delete groupItr->second; } } SST::Core::Environment::EnvironmentConfigGroup* SST::Core::Environment::EnvironmentConfiguration::createGroup(const std::string& groupName) { EnvironmentConfigGroup* newGroup = nullptr; if ( groups.find(groupName) == groups.end() ) { newGroup = new EnvironmentConfigGroup(groupName); groups.insert(std::pair<std::string, EnvironmentConfigGroup*>(groupName, newGroup)); } else { newGroup = groups.find(groupName)->second; } return newGroup; } void SST::Core::Environment::EnvironmentConfiguration::removeGroup(const std::string& groupName) { auto theGroup = groups.find(groupName); if ( theGroup != groups.end() ) { groups.erase(theGroup); } } std::set<std::string> SST::Core::Environment::EnvironmentConfiguration::getGroupNames() { std::set<std::string> groupNames; for ( auto groupItr = groups.begin(); groupItr != groups.end(); groupItr++ ) { groupNames.insert(groupItr->first); } return groupNames; } SST::Core::Environment::EnvironmentConfigGroup* SST::Core::Environment::EnvironmentConfiguration::getGroupByName(const std::string& groupName) { return createGroup(groupName); 
} void SST::Core::Environment::EnvironmentConfiguration::print() { for ( auto groupItr = groups.begin(); groupItr != groups.end(); groupItr++ ) { groupItr->second->print(); } } void SST::Core::Environment::EnvironmentConfiguration::writeTo(const std::string& filePath) { FILE* output = fopen(filePath.c_str(), "w+"); if ( nullptr == output ) { fprintf(stderr, "Unable to open file: %s\n", filePath.c_str()); exit(-1); } const int outputFD = fileno(output); // Lock the file because we are going to write it out (this should be // exclusive since no one else should muck with it) flock(outputFD, LOCK_EX); for ( auto groupItr = groups.begin(); groupItr != groups.end(); groupItr++ ) { groupItr->second->writeTo(output); } flock(outputFD, LOCK_UN); fclose(output); } void SST::Core::Environment::EnvironmentConfiguration::writeTo(FILE* output) { for ( auto groupItr = groups.begin(); groupItr != groups.end(); groupItr++ ) { groupItr->second->writeTo(output); } }
<gh_stars>10-100 package endure import ( "reflect" "strings" ) func removePointerAsterisk(s string) string { return strings.Trim(s, "*") } func isReference(t reflect.Type) bool { return t.Kind() == reflect.Ptr } // Handle all primitive (basic) types func isPrimitive(str string) bool { switch str { case "bool": return true case "string": return true case "int": return true case "int8": return true case "int16": return true case "int32": return true case "int64": return true case "uint": return true case "uint8": return true case "uint16": return true case "uint32": return true case "uint64": return true case "uintptr": return true case "byte": return true case "rune": return true case "float32": return true case "float64": return true case "complex64": return true case "complex128": return true default: return false } }
#include "hnswlib.hpp" namespace hnswlib { static float L2Sqr(const void *pVect1, const void *pVect2, const void *qty_ptr) { //return *((float *)pVect2); size_t qty = *((size_t *) qty_ptr); float res = 0; for (unsigned i = 0; i < qty; i++) { float t = ((float *) pVect1)[i] - ((float *) pVect2)[i]; res += t * t; } return (res); } static float L2SqrSIMD16Ext(const void *pVect1v, const void *pVect2v, const void *qty_ptr) { float *pVect1 = (float *) pVect1v; float *pVect2 = (float *) pVect2v; size_t qty = *((size_t *) qty_ptr); float PORTABLE_ALIGN32 TmpRes[8]; #ifdef __AVX__ size_t qty16 = qty >> 4; const float *pEnd1 = pVect1 + (qty16 << 4); __m256 diff, v1, v2; __m256 sum = _mm256_set1_ps(0); while (pVect1 < pEnd1) { v1 = _mm256_loadu_ps(pVect1); pVect1 += 8; v2 = _mm256_loadu_ps(pVect2); pVect2 += 8; diff = _mm256_sub_ps(v1, v2); sum = _mm256_add_ps(sum, _mm256_mul_ps(diff, diff)); v1 = _mm256_loadu_ps(pVect1); pVect1 += 8; v2 = _mm256_loadu_ps(pVect2); pVect2 += 8; diff = _mm256_sub_ps(v1, v2); sum = _mm256_add_ps(sum, _mm256_mul_ps(diff, diff)); } _mm256_store_ps(TmpRes, sum); float res = TmpRes[0] + TmpRes[1] + TmpRes[2] + TmpRes[3] + TmpRes[4] + TmpRes[5] + TmpRes[6] + TmpRes[7]; return (res); #else // size_t qty4 = qty >> 2; size_t qty16 = qty >> 4; const float *pEnd1 = pVect1 + (qty16 << 4); // const float* pEnd2 = pVect1 + (qty4 << 2); // const float* pEnd3 = pVect1 + qty; __m128 diff, v1, v2; __m128 sum = _mm_set1_ps(0); while (pVect1 < pEnd1) { //_mm_prefetch((char*)(pVect2 + 16), _MM_HINT_T0); v1 = _mm_loadu_ps(pVect1); pVect1 += 4; v2 = _mm_loadu_ps(pVect2); pVect2 += 4; diff = _mm_sub_ps(v1, v2); sum = _mm_add_ps(sum, _mm_mul_ps(diff, diff)); v1 = _mm_loadu_ps(pVect1); pVect1 += 4; v2 = _mm_loadu_ps(pVect2); pVect2 += 4; diff = _mm_sub_ps(v1, v2); sum = _mm_add_ps(sum, _mm_mul_ps(diff, diff)); v1 = _mm_loadu_ps(pVect1); pVect1 += 4; v2 = _mm_loadu_ps(pVect2); pVect2 += 4; diff = _mm_sub_ps(v1, v2); sum = _mm_add_ps(sum, _mm_mul_ps(diff, diff)); v1 = _mm_loadu_ps(pVect1); pVect1 += 4; v2 = _mm_loadu_ps(pVect2); pVect2 += 4; diff = _mm_sub_ps(v1, v2); sum = _mm_add_ps(sum, _mm_mul_ps(diff, diff)); } _mm_store_ps(TmpRes, sum); float res = TmpRes[0] + TmpRes[1] + TmpRes[2] + TmpRes[3]; return (res); #endif } static float L2SqrSIMD4Ext(const void *pVect1v, const void *pVect2v, const void *qty_ptr) { float PORTABLE_ALIGN32 TmpRes[8]; float *pVect1 = (float *) pVect1v; float *pVect2 = (float *) pVect2v; size_t qty = *((size_t *) qty_ptr); // size_t qty4 = qty >> 2; size_t qty16 = qty >> 2; const float *pEnd1 = pVect1 + (qty16 << 2); __m128 diff, v1, v2; __m128 sum = _mm_set1_ps(0); while (pVect1 < pEnd1) { v1 = _mm_loadu_ps(pVect1); pVect1 += 4; v2 = _mm_loadu_ps(pVect2); pVect2 += 4; diff = _mm_sub_ps(v1, v2); sum = _mm_add_ps(sum, _mm_mul_ps(diff, diff)); } _mm_store_ps(TmpRes, sum); float res = TmpRes[0] + TmpRes[1] + TmpRes[2] + TmpRes[3]; return (res); } L2Space::L2Space(size_t dim) { fstdistfunc_ = L2Sqr; if (dim % 4 == 0) fstdistfunc_ = L2SqrSIMD4Ext; if (dim % 16 == 0) fstdistfunc_ = L2SqrSIMD16Ext; /*else{ throw runtime_error("Data type not supported!"); }*/ dim_ = dim; data_size_ = dim * sizeof(float); } size_t L2Space::get_data_size() { return data_size_; } DISTFUNC<float> L2Space::get_dist_func() { return fstdistfunc_; } void *L2Space::get_dist_func_param() { return &dim_; } static int L2SqrI(const void *__restrict pVect1, const void *__restrict pVect2, const void *__restrict qty_ptr) { size_t qty = *((size_t *) qty_ptr); int res = 0; unsigned char *a = (unsigned char *) 
pVect1; unsigned char *b = (unsigned char *) pVect2; /*for (int i = 0; i < qty; i++) { int t = int((a)[i]) - int((b)[i]); res += t*t; }*/ qty = qty >> 2; for (size_t i = 0; i < qty; i++) { res += ((*a) - (*b)) * ((*a) - (*b)); a++; b++; res += ((*a) - (*b)) * ((*a) - (*b)); a++; b++; res += ((*a) - (*b)) * ((*a) - (*b)); a++; b++; res += ((*a) - (*b)) * ((*a) - (*b)); a++; b++; } return (res); } L2SpaceI::L2SpaceI(size_t dim) { fstdistfunc_ = L2SqrI; dim_ = dim; data_size_ = dim * sizeof(unsigned char); } size_t L2SpaceI::get_data_size() { return data_size_; } DISTFUNC<int> L2SpaceI::get_dist_func() { return fstdistfunc_; } void * L2SpaceI::get_dist_func_param() { return &dim_; } }
/** For testing: replace dest with src. (Dest must have a refcount * of 1) */ void crypto_pk_assign_(crypto_pk_t *dest, const crypto_pk_t *src) { tor_assert(dest); tor_assert(dest->refs == 1); tor_assert(src); RSA_free(dest->key); dest->key = RSAPrivateKey_dup(src->key); }
Final Fantasy XV, World of Final Fantasy set for Taipei Game Show this month Update: The Final Fantasy XV presentation is confirmed not to be the January 2016 Active Time Report in a tweet from the game’s official Japanese Twitter profile. Additionally, the tweet also states that the contents from the Final Fantasy XV presentation will differ from the contents for the Active Time Report. Original story below as follows Sony Computer Entertainment Taiwan today revealed plans for Taipei Game Show – which is set to run from January 29th through February 2nd this year – and include a selection of stage events from popular overseas creators. Among them, director Hajime Tabata will be showing off a bit of Final Fantasy XV while producer Shinji Hashimoto will take part in a game demonstration along with director Hiroki Chiba for World of Final Fantasy. On January 31st both games will be on display at the PlayStation booth starting from 11:00 local time (10PM EST on January 30th) where Tabata will kick off his hour-long presentation to share with players worldwide the latest news on Final Fantasy XV. Then at 13:00 local time (12AM EST) Hashimoto and Chiba will go on stage for 35 minutes to talk more on World of Final Fantasy. It’s unknown if this is the date of the promised “late January” Active Time Report but we expect Square Enix to make a formal announcement as we close in on the end of the month. Last year at TPGS details from a similar stage event were shared a few days later in their own dedicated ATR so it’s likely the same thing could happen here, perhaps even reversed.
def GetNicknames(self, domain, username): try: client = AppsClient(domain=domain) client.auth_token = self.Get2loToken() client.ssl = True feed = client.RetrieveNicknames(username) nicknames = [] for entry in feed.entry: nicknames.append(entry.nickname.name) return nicknames except: return ['An error occured while retriving Nicknames for the user.']
package com.arman.plugins.marktext; import com.intellij.ide.scratch.ScratchUtil; import com.intellij.openapi.file.exclude.EnforcedPlainTextFileTypeFactory; import com.intellij.openapi.file.exclude.EnforcedPlainTextFileTypeManager; import com.intellij.openapi.fileTypes.FileType; import com.intellij.openapi.fileTypes.FileTypeManager; import com.intellij.openapi.fileTypes.FileTypes; import com.intellij.openapi.vfs.VirtualFile; import com.intellij.openapi.vfs.VirtualFileWithId; public class Util { public static boolean containsPlainTextFile(VirtualFile file) { if (!file.isDirectory()) { return EnforcedPlainTextFileTypeManager.getInstance().isMarkedAsPlainText(file); } for (VirtualFile f : file.getChildren()) { if (containsPlainTextFile(f)) { return true; } } return false; } public static boolean isApplicableFor(VirtualFile file) { if (file instanceof VirtualFileWithId && !file.isDirectory()) { if (ScratchUtil.isScratch(file)) { return false; } else { FileType originalType = FileTypeManager.getInstance().getFileTypeByFileName(file.getName()); return !originalType.isBinary() && originalType != FileTypes.PLAIN_TEXT; } } else { return false; } } }
The Root Extract of Scutellaria baicalensis Induces Apoptosis in EGFR TKI-Resistant Human Lung Cancer Cells by Inactivation of STAT3 Resistance to epidermal growth factor receptor tyrosine kinase inhibitors (EGFR TKIs) is a major obstacle in managing lung cancer. The root of Scutellaria baicalensis (SB) traditionally used for fever clearance and detoxification possesses various bioactivities including anticancer effects. The purpose of this study was to investigate whether SB exhibited anticancer activity in EGFR TKI-resistant lung cancer cells and to explore the underlying mechanism. We used four types of human lung cancer cell lines, including H1299 (EGFR wildtype; EGFR TKI-resistant), H1975 (acquired TKI-resistant), PC9/ER (acquired erlotinib-resistant), and PC9/GR (acquired gefitinib-resistant) cells. The ethanol extract of SB (ESB) decreased cell viability and suppressed colony formation in the four cell lines. ESB stimulated nuclear fragmentation and the cleavage of poly(ADP-ribose) polymerase (PARP) and caspase-3. Consistently, the proportion of sub-G1 phase cells and annexin V+ cells were significantly elevated by ESB, indicating that ESB induced apoptotic cell death in EGFR TKI-resistant cells. ESB dephosphorylated signal transducer and activator of transcription 3 (STAT3) and downregulated the target gene expression. The overexpression of constitutively active STAT3 reversed ESB-induced apoptosis, suggesting that ESB triggered apoptosis in EGFR TKI-resistant cells by inactivating STAT3. Taken together, we propose the potential use of SB as a novel therapeutic for lung cancer patients with EGFR TKI resistance. Introduction Lung cancer is a leading cause of cancer-related mortality worldwide, accounting for 25% of all cancer deaths . Although the treatment for lung cancer has advanced significantly in recent decades, the 5-year survival rate of lung cancer is still less than 20% . Late-stage diagnosis is related to the poor prognosis of patients with lung cancer. Almost 80% of lung cancer is diagnosed when local invasion or distal metastases has already occurred and the 5-year survival rate for patients with stage IV lung cancer is only 5% . Platinum-based chemotherapy is the principal treatment for advanced non-small-cell lung cancer (NSCLC) . However, chemotherapy drugs can cause a variety of side effects, and the response rate drops when drug resistance develops . Therefore, it is fundamental to develop novel therapeutic strategies for the treatment of lung cancer. Epidermal growth factor receptor (EGFR) mutations are found in approximately 38% of NSCLC patients. EGFR mutations are more frequent in females, non-smokers, and Asian patients . Most EGFR mutations are either short deletions in exon 19 or an L858R substitution in exon 20 , and these EGFR-activating mutations subsequently stimulate EGFR downstream signaling pathways, which leads to the proliferation and 2 of 17 invasion of cancer cells . Notably, first-generation EGFR tyrosine kinase inhibitors (TKIs), including gefitinib and erlotinib, significantly reduced the tumor size and improved the overall survival of NSCLC patients with EGFR mutations . Clinical guidelines also recommended that EGFR mutation-positive patients received first-line therapy with EGFR TKIs . Despite the benefits of EGFR TKIs, acquired resistance develops within one year of treatment and weakens the efficacy of these drugs . 
The most common mechanism of the resistance to EGFR TKIs is the EGFR T790M mutation, which accounts for about 60% of the cases with acquired resistance . The activation of alternative pathways, such as MET amplification or HER2 upregulation, impairment of the pathway involved in the EGFR TKI-induced apoptosis, such as BIM regulation, and histologic transformation are also considered to confer EGFR TKI resistance . Although next-generation EGFR TKIs, including afatinib, an irreversible EGFR inhibitor, and osimertinib, the third-generation TKI that targets the EGFR T790M mutation, have been developed to overcome EGFR TKI resistance, the clinical benefit is limited and additional resistance to these drugs has been reported . Another strategy is combination therapy, which is to add a new agent to EGFR TKIs. However, combination therapy is unlikely to overcome the resistance derived from the EGFR T790M mutation and shows overlapping toxicities . Therefore, the development of new therapeutic drugs for the treatment of lung cancer patients with EGFR TKI resistance is urgently needed. The root of Scutellaria baicalensis (SB) is one of the fundamental herbs used in traditional Oriental medicine. According to traditional herbology, it plays a role in clearing heat and drying dampness. It has been applied in the treatment of hepatitis, hypertension, and acute infections of the respiratory or gastrointestinal tracts for thousands of years . Recent studies have reported that SB possessed various pharmacological activities, such as neuroprotective, liver protective, anti-inflammatory, antibacterial, and antioxidant effects . Furthermore, SB exhibited anticancer activities in a wide range of cancer cells . The SB extracts not only suppressed cell cycle progression and migration, but also induced apoptosis in human lung cancer cells . However, the anticancer effects of SB in EGFR TKI-resistant lung cancer cells, and the role of signal transducer and activator of transcription 3 (STAT3) in the SB-induced apoptosis have not been elucidated yet. In this study, we investigated whether SB exerted anticancer activities in EGFR TKI-resistant lung cancer cells and explored the underlying mechanism. Identification of Baicalin in the Ethanol Extract of SB by HPLC-MS Analysis We performed high-performance liquid chromatography-mass spectrometry (HPLC-MS) to identify baicalin, a constituent marker of SB, in the ethanol extract of SB (ESB). As displayed in Figure 1A, a baicalin peak was detected at a retention time of 12.977 min ( Figure 1A and Table 1). The chromatogram of ESB also showed a peak at a retention time of 13.147 min similar to that of baicalin ( Figure 1B and Table 1). The molecular weight of the indicated peak in the chromatogram of ESB and that of baicalin was the same as m/z 447.1 + , suggesting that the indicated peak was baicalin (Table 1). To quantify the amount of baicalin, we further conducted HPLC-MS/MS analysis. According to the quantification data, ESB contained approximately 18% of baicalin (Table 1). The mass spectral and quantification data of baicalin is presented in Table 1. Establishment of EGFR TKI-Resistant PC9 Cell Lines We established EGFR TKI-resistant cell lines by exposing EGFR TKI-sensitive PC human NSCLC cells to increasing concentrations of erlotinib or gefitinib. It took fo months until stable resistant cell lines, referred to as PC9/ER (erlotinib-resistant PC9) an PC9/GR (gefitinib-resistant PC9), were established. 
To verify whether EGFR TKI r sistance was well-generated, the MTT assay and anchorage-dependent colony formatio assay were conducted. As shown in Figure 2A,B, the cell viability and colony-formin ability of the PC9/ER and PC9/GR cells were significantly higher than those of the PC cells following erlotinib or gefitinib treatment (Figure 2A,B). The annexin V-PI doub staining assay results also showed that 72 h treatment with EGFR TKIs did not indu apoptosis in PC9/ER and PC9/GR cells, while it markedly increased the rate of apoptot cells in PC9 cells ( Figure 2C). Collectively, these results demonstrate that EGFR TKI-r sistant cell lines were well-established. We also verified that H1299 EGFR wildtype ce and H1975 acquired EGFR TKI-resistant cells showed low sensitivity to erlotinib and g fitinib, which was proven by MTT assay (Supplementary Figure S1A,B). Establishment of EGFR TKI-Resistant PC9 Cell Lines We established EGFR TKI-resistant cell lines by exposing EGFR TKI-sensitive PC9 human NSCLC cells to increasing concentrations of erlotinib or gefitinib. It took four months until stable resistant cell lines, referred to as PC9/ER (erlotinib-resistant PC9) and PC9/GR (gefitinib-resistant PC9), were established. To verify whether EGFR TKI resistance was well-generated, the MTT assay and anchorage-dependent colony formation assay were conducted. As shown in Figure 2A,B, the cell viability and colony-forming ability of the PC9/ER and PC9/GR cells were significantly higher than those of the PC9 cells following erlotinib or gefitinib treatment (Figure 2A,B). The annexin V-PI double staining assay results also showed that 72 h treatment with EGFR TKIs did not induce apoptosis in PC9/ER and PC9/GR cells, while it markedly increased the rate of apoptotic cells in PC9 cells ( Figure 2C). Collectively, these results demonstrate that EGFR TKI-resistant cell lines were well-established. We also verified that H1299 EGFR wildtype cells and H1975 acquired EGFR TKI-resistant cells showed low sensitivity to erlotinib and gefitinib, which was proven by MTT assay (Supplementary Figure S1A . The data are expressed as the mean ± SD of three independent experiments. Significance was determined by the Student's t-test (*** p < 0.001 vs. untreated controls). NSCLC, non-small-cell lung cancer; EGFR TKI, Epidermal growth factor receptor tyrosine kinase inhibitor; SD, standard deviation. Inhibition of Cell Growth by ESB in EGFR TKI-Resistant Cell Lines We first investigated the effects of ESB on the cell survival of EGFR TKI-resistant human NSCLC cell lines. We used four types of cell lines, including H1299 (EGFR wild-type; EGFR TKI-resistant), H1975 (EGFR L858R/T790M double-mutant; acquired EGFR TKIresistant), PC9/ER (acquired erlotinib-resistant), and PC9/GR (acquired gefitinib-resistant) cell lines. The MTT assay results showed that ESB dose-dependently decreased the cell viability of the EGFR TKI-resistant cell lines ( Figure 3A). We obtained the same results when cell viability was measured by the trypan blue exclusion assay. ESB suppressed the cell growth of EGFR TKI-resistant cell lines in time-and concentration-dependent manners ( Figure 3B). Taken together, our results suggest that ESB inhibited the cell growth of EGFR TKI-resistant human NSCLC cells. Inhibition of Cell Growth by ESB in EGFR TKI-Resistant Cell Lines We first investigated the effects of ESB on the cell survival of EGFR TKI-res human NSCLC cell lines. 
We used four types of cell lines, including H1299 (EGFR type; EGFR TKI-resistant), H1975 (EGFR L858R/T790M double-mutant; acquired TKI-resistant), PC9/ER (acquired erlotinib-resistant), and PC9/GR (acquired gefitin sistant) cell lines. The MTT assay results showed that ESB dose-dependently decr the cell viability of the EGFR TKI-resistant cell lines ( Figure 3A). We obtained the results when cell viability was measured by the trypan blue exclusion assay. ESB pressed the cell growth of EGFR TKI-resistant cell lines in time-and concentratio pendent manners ( Figure 3B). Taken together, our results suggest that ESB inhibite cell growth of EGFR TKI-resistant human NSCLC cells. Inhibition of Colony Formation by ESB in EGFR TKI-Resistant Cell Lines We next explored the effects of ESB on the colony-forming ability of EGFR T sistant human NSCLC cell lines. As a pivotal step in tumorigenesis, cancer cells gr form colonies. To mimic the tumorigenic environment, we performed the anchorag pendent colony formation assay as well as the anchorage-independent soft agar assa observed that both the EGFR wildtype H1299 cells (EGFR TKI-resistant) and the acq EGFR TKI-resistant H1975 cells formed colonies 10-15 days after seeding. As disp in Figure 4A, ESB suppressed the anchorage-dependent colony formation in both cel ( Figure 4A). In addition, the anchorage-independent colony formation was also obvi reduced by ESB treatment in a concentration-dependent manner ( Figure 4B). Thu observations demonstrate that ESB suppressed the colony-forming ability of EGFR resistant human NSCLC cells. Significance was determined by the Student's t-test (*** p < 0.001 vs. untreated controls). ESB, ethanol extract of the root of Scutellaria baicalensis; NSCLC, non-small-cell lung cancer; EGFR TKI, Epidermal growth factor receptor tyrosine kinase inhibitor; SD, standard deviation. Inhibition of Colony Formation by ESB in EGFR TKI-Resistant Cell Lines We next explored the effects of ESB on the colony-forming ability of EGFR TKI-resistant human NSCLC cell lines. As a pivotal step in tumorigenesis, cancer cells grow to form colonies. To mimic the tumorigenic environment, we performed the anchorage-dependent colony formation assay as well as the anchorage-independent soft agar assay. We observed that both the EGFR wildtype H1299 cells (EGFR TKI-resistant) and the acquired EGFR TKIresistant H1975 cells formed colonies 10-15 days after seeding. As displayed in Figure 4A, ESB suppressed the anchorage-dependent colony formation in both cell lines ( Figure 4A). In addition, the anchorage-independent colony formation was also obviously reduced by ESB treatment in a concentration-dependent manner ( Figure 4B). Thus, our observations demonstrate that ESB suppressed the colony-forming ability of EGFR TKI-resistant human NSCLC cells. ure 4. Effects of ESB on the colony formation of EGFR TKI-resistant human NSCLC cell lines. H1299 (EGFR wildtyp FR TKI-resistant) and H1975 (acquired TKI-resistant) human NSCLC cell lines were seeded as a single cell suspensio 12-well plates (A) or soft agar (B) and treated with the indicated concentrations of ESB for 10-15 days. The colonie re photographed by a digital camera (upper panel), and the number of colonies was counted using ImageJ softwar wer panel). The data are expressed as the mean ± SD of three independent experiments. Significance was determine the Student's t-test (** p < 0.01, *** p < 0.001 vs. untreated controls). 
ESB, ethanol extract of the root of Scutellaria ba ensis; NSCLC, non-small-cell lung cancer; EGFR TKI, Epidermal growth factor receptor tyrosine kinase inhibitor; SD ndard deviation. Induction of Apoptosis by ESB in EGFR TKI-Resistant Cell Lines To examine whether the inhibition of cell growth and colony formation by ES related to apoptosis induction, nuclear 4′,6-diamidino-2-phenylindole (DAPI) st was performed. Chromatin condensation and nuclear fragmentation, typical mark apoptosis, were dose-dependently increased by ESB treatment in EGFR TKI-resista man NSCLC cell lines ( Figure 5A). To validate the results of DAPI staining, Wester analysis was conducted. The results showed that the cleavage of caspase-3 and P apoptosis marker proteins, were upregulated by ESB in EGFR TKI-resistant cell line ure 5B). We next monitored apoptosis by flow cytometry. The 72-h treatment of ESB edly increased the sub-G1 population, generally considered to be apoptotic cells, in centration-dependent manner ( Figure 5C and Supplementary Figure S2A). Cell cy rest at the G1/S-or G2/M-phase was not detected. In addition, the percentage of an V-positive apoptotic cells was also significantly increased by ESB in these cell lines (F 5D and Supplementary Figure S2B). Taken together, our results demonstrate tha triggered apoptosis in EGFR TKI-resistant human NSCLC cells. , and the number of colonies was counted using ImageJ software (lower panel). The data are expressed as the mean ± SD of three independent experiments. Significance was determined by the Student's t-test (** p < 0.01, *** p < 0.001 vs. untreated controls). ESB, ethanol extract of the root of Scutellaria baicalensis; NSCLC, non-small-cell lung cancer; EGFR TKI, Epidermal growth factor receptor tyrosine kinase inhibitor; SD, standard deviation. Induction of Apoptosis by ESB in EGFR TKI-Resistant Cell Lines To examine whether the inhibition of cell growth and colony formation by ESB was related to apoptosis induction, nuclear 4 ,6-diamidino-2-phenylindole (DAPI) staining was performed. Chromatin condensation and nuclear fragmentation, typical markers of apoptosis, were dose-dependently increased by ESB treatment in EGFR TKI-resistant human NSCLC cell lines ( Figure 5A). To validate the results of DAPI staining, Western blot analysis was conducted. The results showed that the cleavage of caspase-3 and PARP, apoptosis marker proteins, were upregulated by ESB in EGFR TKI-resistant cell lines ( Figure 5B). We next monitored apoptosis by flow cytometry. The 72-h treatment of ESB markedly increased the sub-G1 population, generally considered to be apoptotic cells, in a concentration-dependent manner ( Figure 5C and Supplementary Figure S2A). Cell cycle arrest at the G1/S-or G2/M-phase was not detected. In addition, the percentage of annexin V-positive apoptotic cells was also significantly increased by ESB in these cell lines ( Figure 5D and Supplementary Figure S2B). Taken together, our results demonstrate that ESB triggered apoptosis in EGFR TKI-resistant human NSCLC cells. Interestingly, the apoptosis induction by ESB was not consistently observed in EGFR TKI-sensitive PC9 cell line. As shown in Supplementary Figure 3A, the growth-inhibitory effect of ESB in PC9 cells was not as high as in the other EGFR TKI-resistant cell lines. Low concentration of ESB (25 μg/mL) did not show any cytotoxicity in PC9 cells, while the same concentration of ESB definitely suppressed the cell growth in EGFR TKI-resistant cell lines. 
In addition, the cell viability of PC9 cells after ESB treatment at 100 μg/mL was 57.28%, whereas that of EGFR TKI-resistant cell lines was ≤ 50% (Supplementary Figure Interestingly, the apoptosis induction by ESB was not consistently observed in EGFR TKI-sensitive PC9 cell line. As shown in Supplementary Figure S3A, the growth-inhibitory effect of ESB in PC9 cells was not as high as in the other EGFR TKI-resistant cell lines. Low concentration of ESB (25 µg/mL) did not show any cytotoxicity in PC9 cells, while the same concentration of ESB definitely suppressed the cell growth in EGFR TKI-resistant cell lines. In addition, the cell viability of PC9 cells after ESB treatment at 100 µg/mL was 57.28%, whereas that of EGFR TKI-resistant cell lines was ≤50% (Supplementary Figure S3A). Furthermore, the proportion of sub-G1 phase cells and annexin V+ cells was not increased by ESB in PC9 cells, indicating that ESB didn't induce apoptosis in PC9 cell line (Supplementary Figure S3B,C). These results suggest that EGFR TKI-resistant cell lines show higher sensitivity to ESB than EGFR TKI-sensitive cell line. Inactivation of STAT3 Mediates ESB-Induced Apoptosis in EGFR TKI-Resistant Cell Lines We next explored the molecular mechanism underlying the ESB-induced apoptosis in EGFR TKI-resistant cell lines by focusing on the activity of STAT3. Given that STAT3 is frequently activated in NSCLC and controls major cancer properties, STAT3 has been considered a promising target for the treatment of NSCLC . Interestingly, EGFR TKI-resistant tumors displayed increased STAT3 phosphorylation compared to EGFR TKI treatment-naïve or sensitive tumors . Growing evidence suggests that STAT3 may be a key mediator of both primary and acquired resistance to EGFR TKIs . Based on the previous studies, we hypothesized that ESB induced apoptosis in EGFR TKI-resistant cells by regulating STAT3 activity. Consistently with our hypothesis, ESB suppressed the phosphorylation of STAT3 in a time-and concentration-dependent manner in EGFR TKI-resistant human NSCLC cell lines ( Figure 6A,B, Supplementary Figure S4). Consistently, the protein expression of cyclin D1 and survivin target genes transcriptionally regulated by STAT3, was diminished by ESB in a time-dependent manner, suggesting that ESB suppressed the transcriptional activity of STAT3 in EGFR TKI-resistant cell lines ( Figure 6C). To verify the role of STAT3 inactivation in ESB-induced apoptosis, EGFR TKI-resistant cell lines were transfected with constitutively active STAT3 (STAT3 CA) and treated with ESB. At 48 h post-transfection with STAT3 CA, H1299 and H1975 cells exhibited a strong expression of phospho-STAT3 compared to empty vector (EV)-transfected cells, suggesting that the STAT3 CA plasmid was successfully transfected ( Figure 7A,B). Notably, the overexpression of STAT3 CA in H1299 and H1975 cells partially reversed the cytotoxicity of ESB. The percentage of apoptotic cells in EV-transfected cells was increased up to 48.38 ± 3.39% and 50.34 ± 0.25% in H1299 and H1975 cells, respectively, whereas that of STAT3 CAtransfected cells was only 27.23 ± 3.79% and 22.48 ± 0.63%, respectively, for each cell line ( Figure 7C,D). In summary, our observations suggest that ESB triggered apoptosis in EGFR TKI-resistant human NSCLC cells by suppressing STAT3 activity. Notably, the inhibitory effect of ESB on STAT3 activity was not observed in EGFR TKI-sensitive PC9 cell line (Supplementary Figure S3D). 
Given that ESB didn't induce apoptosis in PC9 cells (Supplementary Figure S3B,C), these results support our conclusion that the anticancer effects of ESB is mediated by STAT3 inhibition. phosphorylation of STAT3 in a time-and concentration-dependent manner in resistant human NSCLC cell lines ( Figure 6A,B, Supplementary Figure S4). C the protein expression of cyclin D1 and survivin target genes transcriptional by STAT3, was diminished by ESB in a time-dependent manner, suggesting t pressed the transcriptional activity of STAT3 in EGFR TKI-resistant cell lines Figure 6. Suppression of STAT3 activity by ESB in EGFR TKI-resistant human NSCLC cell lines. H1299 (EGFR w EGFR TKI-resistant), H1975 (acquired TKI-resistant), PC9/ER (acquired erlotinib-resistant), and PC9/GR (acqu fitinib-resistant) human NSCLC cell lines were treated with 100 μg/mL ESB for the indicated periods. (A) The phosphorylated and total STAT3 were detected by Western blot analysis. Actin was used as a loading control. ratio of phosphorylated/total protein was calculated using ImageJ software after normalization to actin. The dat pressed as the mean ± SD of duplicate experiments. Significance was determined by the Student's t-test (* p < 0.0 0.01, *** p < 0.001 vs. untreated controls). (C) The protein expression of STAT3 target genes was detected by Wes analysis. Actin was used as a loading control. ESB, ethanol extract of the root of Scutellaria baicalensis; NSCLC, no cell lung cancer; EGFR TKI, Epidermal growth factor receptor tyrosine kinase inhibitor; STAT3, signal transdu activator of transcription 3; SD, standard deviation. were detected by Western blot analysis. Actin was used as a loading control. (B) The ratio of phosphorylated/total protein was calculated using ImageJ software after normalization to actin. The data are expressed as the mean ± SD of duplicate experiments. Significance was determined by the Student's t-test (* p < 0.05, ** p < 0.01, *** p < 0.001 vs. untreated controls). (C) The protein expression of STAT3 target genes was detected by Western blot analysis. Actin was used as a loading control. ESB, ethanol extract of the root of Scutellaria baicalensis; NSCLC, non-small-cell lung cancer; EGFR TKI, Epidermal growth factor receptor tyrosine kinase inhibitor; STAT3, signal transducer and activator of transcription 3; SD, standard deviation. Notably, the overexpression of STAT3 CA in H1299 and H1975 cells partially reversed the cytotoxicity of ESB. The percentage of apoptotic cells in EV-transfected cells was increased up to 48.38 ± 3.39% and 50.34 ± 0.25% in H1299 and H1975 cells, respectively, whereas that of STAT3 CA-transfected cells was only 27.23 ± 3.79% and 22.48 ± 0.63%, respectively, for each cell line ( Figure 7C,D). In summary, our observations suggest that ESB triggered apoptosis in EGFR TKI-resistant human NSCLC cells by suppressing STAT3 activity. Notably, the inhibitory effect of ESB on STAT3 activity was not observed in EGFR TKI-sensitive PC9 cell line (Supplementary Figure 3D). Given that ESB didn't induce apoptosis in PC9 cells (Supplementary Figure 3B,C), these results support our conclusion that the anticancer effects of ESB is mediated by STAT3 inhibition. Discussion The current study investigated the molecular mechanism by which ESB induced apoptosis in EGFR TKI-resistant human NSLCL cell lines. The novelties of this research are as follows. First, we evaluated the anticancer effects of ESB in EGFR TKI-resistant human NSLCL cell lines for the first time. 
Although previous studies have reported that ESB triggered apoptosis and cell cycle arrest and inhibited the migration and invasion of human lung cancer cells , they did not focus on the activity of ESB in EGFR TKI-resistant cell lines. In this study, we used various cell lines with EGFR TKI-resistance, including H1299, H1975, PC9/ER, and PC9/GR cells, to access whether ESB could be used for the treatment of NSCLC with EGFR TKI resistance. Our results demonstrated that ESB reduced cell growth and colony formation in EGFR TKI-resistant cells by inducing apoptosis. Second, to the best of our knowledge, this was the first study to illustrate that ESB regulated STAT3 activity. No study has reported the role of STAT3 in the pharmacological activity of the crude extract of SB, including anticancer effects. The reason we postulated STAT3 as a potential target of ESB was based on the following points: (i) STAT3 frequently overactivated in NSCLC contributed to cancer progression by regulating major cancer hallmarks, such as cell proliferation, angiogenesis, metastasis, the evasion of immune surveillance, and chemoresistance , (ii) the level of STAT3 phosphorylation was higher in the EGFR TKI-resistant tumors than in the EGFR TKI-naïve or -sensitive tumors , and (iii) previous studies postulated STAT3 as a key mediator of both primary and acquired resistance to EGFR TKIs . Our results showed that STAT3 was dephosphorylated by ESB in EGFR TKI-resistant human NSCLC cells. In addition, the overexpression of constitutively active STAT3 reversed ESB-induced apoptosis. These results collectively demonstrate that the inactivation of STAT3 was involved in the anticancer activity of ESB. It has been suggested that a variety of pathways are involved in the anticancer effects of SB and its constituents . For example, NF-κB was inactivated by SB extract and its constituents, including baicalein or wogonin . According to our preliminary data, p65 NF-κB phosphorylation was not decreased, and even slightly increased by ESB in EGFR TKI-resistant cells, suggesting that NF-κB was not involved in ESB-induced apoptosis. The dephosphorylation of AKT and mTOR protein is also reported to mediate the anticancer effects of SB and its constituents . However, our preliminary data showed that while phosphorylated AKT was slightly downregulated until 12 h posttreatment with ESB, it recovered to control levels at 24 h, dismissing the possibility that AKT inactivation mediated the anticancer effects of ESB. ERK has been reported as another target of SB. ERK was inactivated by baicalein, wogonin, and oroxylin A to inhibit cell growth, migration, and the invasion of cancer cells . Consistently, our preliminary data also showed that ERK phosphorylation was commonly decreased by ESB in EGFR TKI-resistant cell lines, suggesting that ERK can be another candidate involved in ESB-induced apoptosis. This might also explain why the overexpression of constitutively active STAT3 in EGFR TKI-resistant cells did not completely reverse ESB-induced apoptosis. In that case, other upstream ERK kinases, except for STAT3, would have regulated the activity of ERK in a compensatory mechanism. However, it can be still possible that the activity of STAT3 was partially suppressed by ESB even after the transfection of constitutively active STAT3, resulting in the incomplete reversal of ESB-induced apoptosis. In a future study, the specific ESB compound that causes apoptosis in EGFR TKIresistant cell lines should be determined. 
Previous studies have reported that several active compounds of SB suppressed STAT3 activity. For example, baicalin inhibited amyloid β-induced microglial cell activation and promoted neuronal differentiation of neural stem cells by inhibiting STAT3 phosphorylation . Another constituent of SB, baicalein attenuated metastatic potential of breast cancer cells by regulating STAT3 activity . Oroxylin A also decreased STAT3 activity to suppress colitis-stimulated carcinogenesis . Therefore, we propose these compounds as putative ingredients for exerting anticancer effects in the EGFR TKI-resistant cell lines. However, whether other compounds of SB can modulate the STAT3 signaling pathway should be extensively investigated. Whether ESB attenuates cell growth and triggers apoptosis in other cancer cell types with EGFR TKI resistance, such as pancreatic and breast cancer cells, is also an important issue in predicting the possibility of applying ESB to various cancer types . Interestingly, our results showed that ESB did not induce apoptosis, nor did ESB suppress STAT3 phosphorylation in the EGFR TKI-sensitive PC9 cell line. Whether ESB shows selectivity for EGFR TKI-resistant cells and the precise mechanism should be further investigated in the future study. Taken together, our results showed that ESB reduced cell growth and induced apoptosis in EGFR TKI-resistant human NSCLC cell lines by suppressing STAT3 activity. We pro-vide basic information about the potential anticancer effects of ESB in EGFR TKI-resistant NSCLC with highly activated STAT3. Preparation of ESB The dried root of SB was purchased from Bonchomaru (Seoul, Korea). SB (50 g) was pulverized into powder and extracted with 800 mL of 80% ethanol at 40 • C with shaking (150 rpm). After 72 h, the extract was collected, and the SB was extracted again with 300 mL of 80% ethanol under the same conditions. After 24 h, the extract was combined with the previous one, concentrated by a vacuum rotary evaporator under reduced pressure, and lyophilized. The yield was 42.28%. The ESB used in the experiments was prepared by dissolving the powder in dimethyl sulfoxide (DMSO; Amresco, Solon, OH, USA) at 200 mg/mL as a stock solution. HPLC-MS/MS Analysis HPLC analysis was conducted on a Dionex UltiMate 3000 UHPLC system (Thermo Fisher Scientific, San Jose, CA, USA) using Thermo Chromeleon 7 software (Thermo Fisher Scientific, San Jose, CA, USA). Baicalin (ChemFaces, Wuhan, China) was dissolved in 10% DMSO in methanol. The separation was performed on a YMC Triart C18 column (150 × 2.0 mm, 3 µm) using 1% acetic acid in distilled water (DW) and 1% acetic acid in 70% acetonitrile as solvent A and solvent B, respectively. The mobile conditions were as follows: 75% solvent A and 25% solvent B for 10 min, 68% solvent A for 10 min, 55% solvent A for 10 min, 55% solvent A for 4 min, 52% solvent A for 11 min, 75% solvent A for 5 min, and holding for 5 min. The flow rate was 0.2 mL/min and the column temperature was maintained at 40 • C. The detection wavelength was 275 nm. MS analysis was performed on a Compact mass spectrometer LC-MS system (Advion, Ithaca, NY, USA). The mass spectra were recorded over m/z 100-1200 with a scan speed of 1100 (scan time). To quantify the amount of baicalin, HPLC-MS/MS analysis was performed. HPLC analysis was carried out using a 1200 series LC system equipped with a G1322A degasser, G1312A pump, a G1367D autosampler and a G1316A oven (Agilent Technologies, Palo Alto, CA, USA). 
Chromatographic separation was performed on a InfinityLab Poroshell 120 EC-C18 column (2.1 × 100 mm, 2.7 µm, Agilent Technologies, Folsom, CA, USA). The binary solvent system consisted of 0.1% Formic acid, 5 mM Ammonium Formate in DW (A) and 0.1% Formic acid, 5 mM Ammonium Formate in Methanol (B). The mobile conditions were as follows: 80% solvent A and 20% solvent B for 1 min, 5% solvent A for 1.1 min, and holding 5.4 min, 80% solvent A for 6.6 min, and holing 3. Cell Culture H1299, H1975, and PC9 human NSCLC cell lines were kind gifts from Professor Ho-Young Lee (Seoul National University, Seoul, Korea). RPMI-1640 (WelGENE, Daegu, Korea) with 10% fetal bovine serum (FBS; WelGENE, Daegu, Korea) and 1% antibiotics (WelGENE, Daegu, Korea) was used as the culture medium. The cells were sub-cultured every three days and maintained at 37 • C under 5% CO 2 . Erlotinib-resistant PC9 (PC9/ER) and gefitinib-resistant PC9 cell lines (PC9/GR) were established by exposing PC9 cells to increasing concentrations of erlotinib (LC Laboratories, Woburn, MA, USA) or gefitinib (Cayman Chemical, Ann Arbor, MI, USA). The PC9 cells were treated with 0.1 µM of erlotinib or gefitinib as a starting concentration. The culture medium was changed to fresh medium every three days until the cells reached confluence. The cells were then incubated with gradually increasing concentrations of EGFR TKIs for four months. The final concentrations of the drugs were 50 µM for erlotinib and 25 µM for gefitinib. The PC9/ER and PC9/GR cell lines were cultured in medium containing erlotinib (25 µM) or gefitinib (12.5 µM), respectively. MTT Assay Cells (3 × 10 3 ) were plated into 96-well plates and stabilized overnight. The culture medium was replaced with fresh medium (120 µL) containing the indicated drugs. After 72 h of incubation, 120 µL of MTT solution (4 mg/mL) was added to each well. After 2 h of incubation at 37 • C, the culture media was suctioned from the plates. Then, 100 µL of DMSO was added to each well as an MTT solvent. The plates were shaken on an orbital shaker for 10 min to fully solubilize the formazan crystals. The absorbance of each well at 540 nm was measured using a microplate reader (SpectraMax M3; Molecular Devices, San Jose, CA, USA). Trypan Blue Exclusion Assay Cells (2 × 10 4 ) were plated in 12-well plates and stabilized overnight. The cells were treated with various concentrations of ESB and collected 24-72 h posttreatment with ESB. The harvested cells were washed and resuspended in 1 mL of phosphate-buffered saline (PBS, WelGENE, Daegu, Korea). Then, 0.1 mL of 0.4% trypan blue solution (WelGENE, Daegu, Korea) was added to 0.1 mL of cell suspension. The number of live cells was counted using a hemocytometer under a microscope (Leica, Wetzlar, Germany) by excluding the blue-stained cells. Colony Formation Assay For the anchorage-dependent colony formation assay, 2 × 10 2 cells were plated in 12-well plates. After stabilizing overnight, the cells were treated with the indicated drugs. The cells were incubated at 37 • C until colonies were fully formed and grown, and the culture medium was replaced with fresh medium every three days. Ten days posttreatment, the colonies were fixed with 100% methanol for 5 min, stained with hematoxylin (Sigma-Aldrich, St. Louis, MO, USA) for 30 min, and washed with distilled water before obtaining images. 
For the anchorage-independent soft agar assay, 4% SeaPlaque agarose (Lonza, Rockland, ME, USA) was diluted with warm culture media at a ratio of 1:3 to make 1% bottom agar and overlaid on each well of 24-well plates. After the bottom agar was solidified, 0.4% agar (top agar) containing 1 × 10 3 cells was added onto the bottom agar and left to solidify at room temperature. Then, 0.5 mL of warm culture media containing the indicated concentrations of ESB was added onto the top agar. The cells were incubated at 37 • C until colonies were fully formed and grown, and the culture media was replaced with fresh medium every three days. After 15 days of incubation, MTT solution (4 mg/mL) was added to the culture media at 0.5 mg/mL. The colonies were stained with MTT solution for 2 h at 37 • C. Images of the colonies were taken with a digital camera (Canon, Tokyo, Japan) and the number of colonies was measured using ImageJ software (version 1.52a). DAPI Staining Cells (1 × 10 5 ) were plated in 6-well plates, stabilized overnight, and challenged with different concentrations of ESB. The cells were harvested and fixed with 3.7% paraformaldehyde (Sigma-Aldrich) for 30 min at 4 • C. After washing with cold PBS, the cells were attached to slide glasses using a Cytospin (Shandon Inc., Pittsburgh, PA, USA). Then, the cells were stained with 2.5 µg/mL DAPI solution for 20 min at room temperature in the dark for nuclear staining, washed with PBS, and mounted with mounting solution (Biomeda, Foster City, CA, USA). The nuclei were observed under fluorescence microscopy at ×200 magnification (Carl Zeiss, AG, Germany). Flow Cytometry Cells (1 × 10 5 ) were seeded in 6-well plates, stabilized overnight, and treated with the indicated drugs. The cells were collected and subjected to cell cycle analysis. The cells were treated with cold 80% ethanol for 1 h at 4 • C for fixation and stained with 50 µg/mL propidium iodide (PI) solution (Sigma-Aldrich, St. Louis, MO, USA) containing 30 µg/mL RNase A (Sigma-Aldrich, St. Louis, MO, USA) for 30 min in the dark. After centrifugation, the supernatant was discarded, and the cells were resuspended in 500 µL of PBS. The percentage of cells in each phase of the cell cycle was measured by flow cytometry (FACSCaliber, Becton Dickinson and Company, San Jose, CA, USA). The cells in the sub-G1 fraction were considered apoptotic cells. For the annexin V-PI double-staining assay, the cells were stained with both annexin V-FITC and PI using the Annexin V-FITC Apoptosis Detection Kit I (BD Biosciences Pharmingen, San Diego, CA, USA) as described by the manufacturer. Then, the annexin-and/or PI-stained cells were measured by flow cytometry. The annexin V+ cells were considered apoptotic cells. Western Blots Cells (2 × 10 5 ) were seeded in 60-mm dishes and treated with the indicated drugs. The cells were collected and lysed for 1 h on ice ibitor cocktail (Thermo Fisher Scientific, San Jose, CA, USA) and phosphatase inwith cold RIPA buffer (Thermo Fisher Scientific, San Jose, CA, USA) with added protease inhhibitors (1mM Na 3 VO 4 and 100 mM NaF). After centrifugation at 16,000× g at 4 • C for 30 min, the protein concentration of the supernatant was measured using the bicinchoninic acid (BCA) protein assay kit (Pierce Biotechnology, Rockford, IL, USA). Protein (20 µg) from each sample was subjected to sodium dodecyl sulfate (SDS)-polyacrylamide gel electrophoresis and transferred to polyvinylidene fluoride (PVDF) membranes. 
After blocking with 3% bovine serum albumin (BSA, GenDEPOT, Barker, TX, USA) for 30 min at room temperature, the membranes were washed with TBST for 10 min and probed with the specific primary antibodies at 1:1000 dilutions overnight at 4 • C. The membrane was then washed with TBST for 1 h and treated with the appropriate secondary antibody solution (1:10,000 dilution in 3% skim milk) for 1 h at room temperature. Protein expression was detected by the D-Plus ECL Femto System (Donginbio, Seoul, Korea). The primary antibodies against cyclin D1 and actin were purchased from Santa Cruz Biotechnology (Santa Cruz, CA, USA) and the other primary antibodies were all purchased from Cell Signaling Technology (Beverly, MA, USA). The goat anti-mouse secondary antibody and the goat anti-rabbit secondary antibody were purchased from Bethyl Laboratories (Montgomery, TX, USA) and Enzo Life Sciences (Farmingdale, NY, USA), respectively. Transfection Cells (3 × 10 5 ) were plated in 6-well plates and stabilized overnight. The cells were transfected with 1 µg of constitutively active STAT3 plasmid (pExpress-STAT3Y705D) using 3 µg of Lipofectamine 2000 (Invitrogen, Carlsbad, CA, USA) as described in the manufacturer's protocol. The constitutively active STAT3 plasmid was a gift from Professor Ho-Young Lee (Seoul National University, Seoul, Korea). After 48 h, the cells were harvested for Western blot analysis or seeded again for flow cytometry. Statistical Analyses Each result is expressed as the mean ± SD of data obtained from triplicate experiments. The statistical analysis was performed by a paired Student's t-test. Differences at p < 0.05 were considered statistically significant.
package org.springboot.samples.rest.dto; public class JobHistoryDto { }
#ifndef BOOST_ARCHIVE_ARCHIVE_EXCEPTION_HPP #define BOOST_ARCHIVE_ARCHIVE_EXCEPTION_HPP // MS compatible compilers support #pragma once #if defined(_MSC_VER) # pragma once #endif /////////1/////////2/////////3/////////4/////////5/////////6/////////7/////////8 // archive/archive_exception.hpp: // (C) Copyright 2002 <NAME> - http://www.rrsd.com . // Use, modification and distribution is subject to the Boost Software // License, Version 1.0. (See accompanying file LICENSE_1_0.txt or copy at // http://www.boost.org/LICENSE_1_0.txt) // See http://www.boost.org for updates, documentation, and revision history. #include <exception> #include <boost/assert.hpp> #include <string> #include <boost/config.hpp> #include <boost/archive/detail/decl.hpp> // note: the only reason this is in here is that windows header // includes #define exception_code _exception_code (arrrgghhhh!). // the most expedient way to address this is be sure that this // header is always included whenever this header file is included. #if defined(BOOST_WINDOWS) #include <excpt.h> #endif #include <boost/archive/detail/abi_prefix.hpp> // must be the last header namespace boost { namespace archive { ////////////////////////////////////////////////////////////////////// // exceptions thrown by archives // class BOOST_SYMBOL_VISIBLE archive_exception : public virtual std::exception { private: char m_buffer[128]; protected: BOOST_ARCHIVE_DECL unsigned int append(unsigned int l, const char * a); BOOST_ARCHIVE_DECL archive_exception() BOOST_NOEXCEPT; public: typedef enum { no_exception, // initialized without code other_exception, // any excepton not listed below unregistered_class, // attempt to serialize a pointer of // an unregistered class invalid_signature, // first line of archive does not contain // expected string unsupported_version,// archive created with library version // subsequent to this one pointer_conflict, // an attempt has been made to directly // serialize an object which has // already been serialized through a pointer. // Were this permitted, the archive load would result // in the creation of an extra copy of the obect. incompatible_native_format, // attempt to read native binary format // on incompatible platform array_size_too_short,// array being loaded doesn't fit in array allocated input_stream_error, // error on input stream invalid_class_name, // class name greater than the maximum permitted. // most likely a corrupted archive or an attempt // to insert virus via buffer overrun method. unregistered_cast, // base - derived relationship not registered with // void_cast_register unsupported_class_version, // type saved with a version # greater than the // one used by the program. This indicates that the program // needs to be rebuilt. multiple_code_instantiation, // code for implementing serialization for some // type has been instantiated in more than one module. output_stream_error // error on input stream } exception_code; exception_code code; BOOST_ARCHIVE_DECL archive_exception( exception_code c, const char * e1 = NULL, const char * e2 = NULL ) BOOST_NOEXCEPT; virtual BOOST_ARCHIVE_DECL ~archive_exception() BOOST_NOEXCEPT_OR_NOTHROW ; virtual BOOST_ARCHIVE_DECL const char * what() const BOOST_NOEXCEPT_OR_NOTHROW ; }; }// namespace archive }// namespace boost #include <boost/archive/detail/abi_suffix.hpp> // pops abi_suffix.hpp pragmas #endif //BOOST_ARCHIVE_ARCHIVE_EXCEPTION_HPP
from django.conf.urls import url from django.views.decorators.csrf import csrf_exempt, ensure_csrf_cookie from allegation.views import AreaAPIView from allegation.views import ( OfficerAllegationGISApiView, OfficerAllegationClusterApiView) from allegation.views import PoliceWitnessAPIView from allegation.views import ( OfficerAllegationSummaryApiView, OfficerListAPIView) from allegation.views.allegation_search_view import AllegationSearchView from allegation.views.allegation_view import AllegationView from allegation.views.officer_allegation_analysis_api_view import ( OfficerAllegationAnalysisAPIView) from allegation.views.allegation_download_view import AllegationDownloadView from allegation.views.officer_allegation_race_gender_api import ( OfficerAllegationRaceGenderAPI) from allegation.views.officer_allegation_sunburst_view import ( OfficerAllegationSunburstView) from allegation.views.session_view import SessionAPIView from allegation.views.officer_allegation_api_view import ( OfficerAllegationAPIView) from allegation.views.sunburst_view import SunburstView from allegation.views.sunburst_image_view import SunburstImageView from common.middleware.cache import orderless_cache_page cache_view = orderless_cache_page(86400 * 90) urlpatterns = [ url(r'^api/officer-allegations/$', cache_view(OfficerAllegationAPIView.as_view()), name='officer-allegation-api'), url(r'^api/officer-allegations/race-gender/$', cache_view(OfficerAllegationRaceGenderAPI.as_view()), name='officer-allegation-race-gender-api'), url(r'^api/officer-allegations/analysis$', OfficerAllegationAnalysisAPIView.as_view(), name='officer-allegation-api-analysis'), url(r'^api/officer-allegations/gis/$', cache_view(OfficerAllegationGISApiView.as_view()), name='officer-allegation-api-gis'), url(r'^api/officer-allegations/cluster/$', cache_view(OfficerAllegationClusterApiView.as_view()), name='officer-allegation-api-clusters'), url(r'^api/officer-allegations/summary/$', cache_view(OfficerAllegationSummaryApiView.as_view()), name='officer-allegation-api-summary'), url(r'^api/officer-allegations/officers/$', cache_view(OfficerListAPIView.as_view()), name='officer-allegation-api-officers'), url(r'^api/officer-allegations/sunburst/$', cache_view(OfficerAllegationSunburstView.as_view()), name='officer-allegation-api-sunburst'), url(r'^api/areas/$', cache_view(AreaAPIView.as_view()), name='area-api'), url(r'^api/police-witness/$', cache_view(PoliceWitnessAPIView.as_view()), name='police-witness'), url(r'^allegations/download/', (AllegationDownloadView.as_view()), name='allegation-download'), url(r'^api/allegations/session/$', csrf_exempt(ensure_csrf_cookie(SessionAPIView.as_view())), name='allegation-api-session'), url(r'^q/(?P<term>\w+)/$', cache_view(AllegationSearchView.as_view()), name='search-q-page'), url(r'^s/(?P<term>\w+)/$', cache_view(AllegationSearchView.as_view()), name='search-s-page'), url(r'^complaint/(?P<crid>\d+)/(?P<category_slug>.*)/(?P<cat_hash>\w{8})$', cache_view(AllegationView.as_view()), name='complaint-page'), url(r'^sunburst-image/(?P<hash_id>\w{6})/$', cache_view(SunburstImageView.as_view()), name='sunburst-image'), url(r'^sunburst/(?P<hash_id>\w{6})/$', cache_view(SunburstView.as_view()), name='sunburst') ]
def plot_temperature_hist(runs): num_runs = 0 for run in runs: if len(run.thermal.data_frame): num_runs += 1 if num_runs == 0: return axis = pre_plot_setup(ncols=num_runs) if num_runs == 1: axis = [axis] for ax, run in zip(axis, runs): run.thermal.plot_temperature_hist(ax, run.name)
def largest_max_heap(h, k): aux = [(-h[0], 0)] largest = [] for _ in xrange(k): x, i = heapq.heappop(aux) largest.append(-x) for j in range(2*i+1, 2*i+3): if j < len(h): heapq.heappush(aux, (-h[j], j)) return largest
Polymorphisms in lncRNA CCAT1 on the susceptibility of lung cancer in a Chinese northeast population: A case-control study. OBJECT To explore the association of rs1948915, rs7013433 in long noncoding RNA (lncRNA) CCAT1 and rs6983267 in MYC enhancer region with the risk of lung cancer in a Chinese northeast population, a case-control study was conducted. METHODS The hospital-based case-control study contained 669 lung cancer patients and 697 healthy controls. Taqman® Probe allele resolution was used for genotyping. The differences between the case-control groups were analyzed using Student t-test and chi-square test. Logistic regression analysis was used to assess the relationship between the genotypes and the risk of lung cancer. Cross-generation analysis was used to explore the relationship between gene-environment interaction and lung cancer. RESULTS There was no association between the three selected single-nucleotide polymorphisms (SNPs) and the susceptibility of lung cancer. Rs1948915 CT was correlated with lung adenocarcinoma. In female stratification, rs1948915 CT/CC was associated with a decreased susceptibility of lung cancer significantly. Additionally, the additive and multiplicative interaction models showed that there was no interaction between the three selected SNPs and smoking status in lung cancer. CONCLUSIONS There may be an association between lung adenocarcinoma and rs1948915 polymorphism in the Chinese northeast population, while rs7013433 and rs6983267 might have no association. There was no interaction between the three selected SNPs and smoking status.
package service import ( "poc-localstack-dynamo-golang/domain/entities" "poc-localstack-dynamo-golang/domain/repository" ) type GetService interface { Get(parameter entities.Parameter) (entities.Parameter, error) } type GetServiceImpl struct { } func NewGetService() GetService { return &GetServiceImpl{} } func (s GetServiceImpl) Get(parameter entities.Parameter) (entities.Parameter, error) { managementRepository := repository.NewParameterRepository() parameterFound, err := managementRepository.Get(parameter) if err != nil { return entities.Parameter{}, err } return parameterFound, err }
Snorlax, The Sleeping Pokémon. It is not satisfied unless it eats over 880 pounds of food every day. When it is done eating, it goes promptly to sleep. Its stomach can digest any kind of food, even if it happens to be moldy or rotten.When its belly is full, it becomes too lethargic to even lift a finger, so it is safe to bounce on its belly.What sounds like its cry may actually be its snores or the rumblings of its hungry belly. Its stomach's digestive juices can dissolve any kind of poison. It can even eat things off the ground. Overview Snorlax let out a heavy sigh as he crammed himself back into his run down trailer. It was another long day at the lumber mill. Things weren't getting any better and he wasn't getting any younger. In his youth he was a star. He could use Hyper Beam and Belly Drum like none of his peers. His ability to use Curse was unmatched. Nothing was as good at what he did and nothing ever would be. But somewhere along his career something went wrong. Snorlax pushed aside the piles of unpaid bills and wedged himself back into his stained lounge chair. One greasy, tubby hand pawed at the remote, while the other buried itself in a bag of expired cheetos. He flipped the TV on just in time to watch a Mega Lucario ram a Close Combat right into the face of a Delphox. Snorlax heard a faint voice utter "It's not very effective" as the magical fox fell to the ground in agony. He changed the channel - It was an Infernape. Snorlax angrily continued channel surfing... Scizor, Blaziken, Aegislash, Skarmory, Terrakion... It was all the same. Snorlax let out a flurry of Curses. His Attack and Defense rose, but as his Speed fell he slumped back into his greasy chair and realized the undeniable truth... The world had simply changed too much for a fat bear like himself to keep up. Positives - High HP, Attack, and Special Defense give Snorlax the perfect stats to shrug off Special hits and retaliate hard. Assault Vest makes it all but unkillable on the Special side. - Ghost immunity is incredible now that Steel no longer resists it. Pursuit makes Snorlax an excellent Ghost hunter even if the concept of 'lax chasing anything down is doubtful at best. - It blocks roads well. Negatives - Tyranitar and Scizor basically do the ghost killing job better with stronger Pursuits and better typing. - Snorlax's flagship sets Hyper Beam, Belly Drum, and Curse that made him popular in the past are now completely useless. - Close Combat exists. - No recovery besides Rest / Sleep Talk prevents Snorlax from sticking around reliably. Why doesn't it learn Slack Off? - Slow. You Speed tie with Slowbro and get circles run around you by Bronzong. - Snorlax's Defense isn't very good. It will take hits okay because of its massive HP but it will crumble before boosted strikes. - Normal typing has terrible neutral coverage. No matter what you run something will always wall you. - The Pokeflute doesn't exist anymore as a usable item. When Snorlax falls asleep don't expect it to survive long enough to wake up again. Abilities Immunity: The Pokémon cannot be under the POISON condition while having this ability. Thick Fat is better but if your team has those two types covered Immunity is kinda usable I guess. Thick Fat: Fire and Ice-type moves deal 50% damage. Amazing. Effectively gives you two resistances and an immunity. Not much but it means crazy sun boosted Fire Blasts won't rip Snorlax in two. Hidden Ability (Not Available): Gluttony: A held Berry is eaten earlier than usual when HP is low. 
You should never use this for any reason. Movesets Your path is blocked! - Return / Body Slam - Pursuit - Earthquake - Fire Punch / Wild Charge / Crunch / Selfdestruct / Ice Beam / Fire Blast Item Attached: Assault Vest Ability: Thick Fat EVs and Nature: EVs: 252 Atk / 4 HP / 252 SDef Adamant Nature Snorlax has fallen hard from his glory days. For a time it was the most powerful non-Legendary Pokemon in the game. Sadly the move "Close Combat" alongside a bunch of Fighting types has all but guaranteed Snorlax will never again be a Pokemon seen in standard play. It does have a few sets that BARELY make him usable. Assault Vest gives Snorlax a decent niche as a bulky Pursuit trapper. The vest gives Snorlax a 1.5x boost to its Special Defense. That boost is so great that Gengar Focus Blast is only a 4HKO, Specs Latios Draco Meteor a 5ish-HKO, and Greninja's omni-STAB hits practically bounce off your quivering hide. This gives you ample time to Pursuit them into oblivion. Against a foe that might not immediately switch you can try your luck with a number of other offensive moves. Return is your primary power move; one that is strong enough to wipe out Greninja in one shot and Latias/os in two. If you are looking for more utility Body Slam is an option for its annoying 30% Paralysis chance. The damage drop is noticeable though. Pursuit is the most important move on the set. It crams a hot load of inescapable destruction into the nearest Psychic or Ghost type so your favorite Fighting type can sweep. Ironic that Snorlax's only remaining use is to help the same Fighting types that put him out of a job. Earthquake hits Heatran, Lucario, and Aegislash but don't expect to beat the latter two very often. The last slot can be pretty much anything. Fire Punch strikes the one Pokemon you actually outspeed, Ferrothorn, for massive damage. It also hits Forretress and Scizor but won't outdamage Return and Earthquake on much else. You 4HKO Skarmory AT BEST so don't try using it for that. Wild Charge blitzes Gyarados who think you can't hurt them but otherwise hits nothing of value. Crunch strikes down Gengar in one hit and prevents it from Disabling your only means of hitting it (Pursuit). Selfdestruct has 300 base power with STAB, more than enough to wipe out anything that isn't particularly bulky. Ice Beam is a weird option but it does about 70-80% to Landorus and Gliscor which is significantly more than anything else Snorlax can muster. If you run Ice Beam use a Brave Nature instead of Adamant. It actually makes a decent fighting type partner as it can Pursuit Ghost and Psychics and lure in and Ice Beam physical flying walls. Fire Blast is a final option to hit the same things Fire Punch hits but also catch Skarmory for a solid 2HKO. It also OHKOs Forretress and max HP Scizor as well as 2HKOing Ferrothorn. It is a light slap against anything else though. Snorlax woke up! It attacked in a grumpy rage! - Return / Double Edge - Pursuit - Earthquake - Fire Punch / Wild Charge / Crunch / Selfdestruct Item Attached: Choice Band Ability: Thick Fat EVs and Nature: EVs: 252 Atk / 252 HP / 4 SDef Adamant Nature Band Snorlax runs a slightly different route. While it still can tank Special hits well enough slapping a Choice Band on Snorlax turns it into a pretty solid hitter. Choice Band Return hits like a truck easily smashing aside all but the most physically bulky Pokemon in 1-2 hits. Double Edge hits even harder possibly 2HKOing even some Hippowdon. 
The recoil can be a pain but the damage difference is noticeable while the high HP sponges a bit of the recoil. Pursuit has pretty decent power when boosted by Choice Band. So much so that it can OHKO a Gengar that stays in provided it takes Stealth Rocks and Sandstorm damage. Earthquake brushes aside most grounded Steel types and exterminates Tyranitar. Fire Punch wrecks Scizor and his 4x weak friends but also has enough power to force Skarmory to switch out. It doesn't 2HKO but it does enough to keep Skarmory spamming Roost until it is either burned or crit. Wild Charge on the other hand is a solid 2HKO on Skarmory while also demolishing Gyarados. Crunch pounds Lati@s into the ground and cuts a significant chunk of Celebi's life away. Selfdestruct deals incredible damage, doing up to 85% to even 252/252 Impish Hippowdon. With proper use of Pursuit and Selfdestruct you can possibly manage two KOs per game... if you're lucky. Other Options Curse, Belly Drum, Whirlwind, Superpower, Rest + Sleep Talk, and Seed Bomb If you would like to relive some of Snorlax's glory days Curse can be used. The problem is it lacks enough coverage with its remaining moveslots to attempt much of a sweep. It's also incredibly vulnerable to Close Combat and Trick, both of which are on nearly every team. It also loses to things like Swords Dance Landorus and Mega Kangaskhan that simply boost up faster than Snorlax can. Belly Drum is bad and should never be used. Lets just get that one out of the way. For some reason Snorlax learns Whirlwind (... but not Slack Off). It's decent on something with those stats but if you're trying to use Snorlax as a wall or phaser you should really be using something else. Superpower blasts Tyranitar hard who might actually try and switch in on you. It doesn't hit much else that your other moves don't already however. Rest + Sleep Talk is decent but with the popularity of Spikes and the general power increase Snorlax simply lacks the ability to survive for three turns. Seed Bomb hits Rotom W and Quagsire but literally nothing else worth noting. Double & Triple Battle Options Slow and fairly bulky. Selfdestruct is his main selling point. There are slightly too many Close Combats floating around for Snorlax to be a top tier threat but if you want to run it Snorlax can probably make its mark on the opponent. Countering Snorlax Lucario is probably the biggest counter. Close Combat utterly ravages the tubby bear even if it has used Curse a few times. Snorlax can do little back besides hope he can catch Lucario on the switch in with Earthquake. Aegislash takes very little from anything, even Earthquake, and Swords Dances in Snorlax's face. Secret Sword ignores stat boosts so Snorlax can't even try to Curse up. Skarmory doesn't care much about what Snorlax does and Spikes, then Whirlwinds you away. Spike damage will add up and Snorlax has no good ways of healing itself. Ferrothorn does a similar job but with Leech Seed sapping away at Snorlax's gargantuan HP. Scizor laughs at Snorlax as well as long as it doesn't take a Fire Punch. Terrakion crushes Snorlax with Close Combat. In fact anything that is of the Fighting type and learns Close Combat is a counter. Really anything with Superpower. Even some things with Outrage like Kyurem and Garchomp are counters because Snorlax just can't take many physical hits (not even with Curse). The best way to beat Snorlax is to try and play carefully around Pursuit by using your Ghosts / Psychics sparingly. 
Snorlax was once a glorious golden standard of power back in the day, but over the next 15 years it didn't really change much at all; it watched as the world passed it by. Snorlax is an outdated relic of the past and is not something you have to prepare for, but something you will just passively beat, because its tactics are so primitive that they won't have much of an effect on most modern strategies.

Pre-Evolution Corner - Munchlax

Living the Lax-urious Life
- Recycle
- Return
- Power-up Punch / Fire Punch
- Earthquake / Pursuit
Item Attached: Berry Juice
Ability: Thick Fat
EVs and Nature:
EVs: 236 Atk / 36 Def / 236 Sp. Def
Adamant Nature (+Atk, -Sp. Atk)

Munchlax is a Pokemon in Little Cup that has a lot of potential. It has a very high base stat total (especially for a baby), very high HP, a very good movepool, and a variety of different things it can pull off. The downsides, though, are very exploitable: even with high HP and Special Defense, its physical Defense is very poor, and its Speed is tied for the lowest of ALL Pokemon. It does take a commendable amount of time to kill, though, and this set is based around that feature. Recycle + Berry Juice is way too good and makes Munchlax extremely potent. Return hurts coming from a good Attack stat. Power-Up Punch is one of the few new things the Snorlax line got. Fire Punch hits Steels for solid damage and usually hits harder than Power-Up Punch. Earthquake hits Rock types hard and is good for coverage. Pursuit is good for catching frail Psychic types if they try to switch out. In layman's terms, Thick Fat just gives it more resistances. The EVs and nature max out Munchlax's good Attack while making it tankier. Fighting types still destroy Munchlax, as its Defense is really bad, and Toxic and burns destroy it as well. If left alone, however, without many physical attackers around, Munchlax can wreck teams.

Stop Cursing. Try to Re-Lax
- Curse
- Rest
- Sleep Talk
- Body Slam / Return
Item Attached: Eviolite
Ability: Thick Fat
EVs and Nature:
EVs: 236 Atk / 36 Def / 236 Sp. Def
Adamant Nature (+Atk, -Sp. Atk) / Careful Nature (+Sp. Def, -Sp. Atk)

To patch up Munchlax's bad Defense while leveraging its tankiness, Curse with RestTalk is brilliant. It does take a while to set up, but once it gets going it is incredibly dangerous and nearly impossible to kill. Curse boosts Munchlax's Attack and Defense and lowers its pointless Speed, since it is outrun by literally everything anyway. Rest gets rid of bad status effects and always brings it back up to full health. Sleep Talk lets Munchlax attack or set up more Curses while asleep. Body Slam is good because its 30% paralysis chance can hinder enemy Pokemon on top of the damage, which in some cases lets Munchlax have its dream come true: moving first. Return is a higher-damage alternative, but both are good. Thick Fat again effectively gives it resistances to Ice and Fire moves. The EVs and nature are the same as the first set, but you could use the Special Defense boosting Careful nature if you know you can set up more Curses. It takes a bit of momentum to get this Munchlax set started, but once it is going it is really hard to stop without a few critical hits.
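To show what an EV spread like 236 Atk / 36 Def / 236 Sp. Def actually translates to at Little Cup levels, here is a hedged sketch of the standard level-5 stat formula. Munchlax's base stats (135 HP / 85 Atk / 40 Def / 40 Sp. Atk / 85 Sp. Def / 5 Spe) are assumed from the standard dex rather than stated above, so double-check them before relying on the exact outputs.

// Sketch of the standard stat formula at level 5 (Little Cup).
// Assumed base stats for Munchlax: 135/85/40/40/85/5 (verify vs a dex).
function lcStat(base: number, ev: number, iv = 31, natureMod = 1.0): number {
  const core = Math.floor((2 * base + iv + Math.floor(ev / 4)) * 5 / 100)
  return Math.floor((core + 5) * natureMod)
}

function lcHp(base: number, ev: number, iv = 31): number {
  return Math.floor((2 * base + iv + Math.floor(ev / 4)) * 5 / 100) + 5 + 10
}

// With the set's spread and an Adamant nature (+Atk, -Sp. Atk):
const hp = lcHp(135, 0)               // ~30
const atk = lcStat(85, 236, 31, 1.1)  // ~19
const def = lcStat(40, 36)            // ~11
const spDef = lcStat(85, 236)         // ~18
// At totals this small every point matters, which is why the odd-looking
// 236/36/236 spread and the nature choice get discussed at all.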
Locations in Games

Ruby/Sapphire/Emerald: Trade from FireRed/LeafGreen/XD
FireRed/LeafGreen: Routes 12 & 16
Colosseum/XD: Snagged from Cipher Admin Ardos in Citadark Isle (XD)
Diamond/Pearl/Platinum: Evolve Munchlax
HeartGold/SoulSilver: Route 11/12
Black/White: Evolve Munchlax
Black 2/White 2: Trade from Yancy/Curtis
X/Y: Route 7
import { AlfredError } from '@/project';
import { Item, List } from '@/workflow';
import compose from 'stampit';

export const Label: todoist.LabelFactory = compose({
  /**
   * @constructor
   * @param {Label} label A new label
   */
  init(this: todoist.LabelInstance, label: todoist.Label = { name: '', id: -1 }) {
    // Reject missing or empty names (the default argument intentionally fails here)
    if (!label.name || label.name === '') {
      throw new AlfredError(`A label must have a name (${label.name}) property`)
    }

    if (!label.id || label.id === -1) {
      throw new AlfredError(`A label must have an id (${label.id}) property`)
    }

    Object.assign(this, label)
  }
})

export const LabelList: todoist.LabelListFactory = compose(
  List,
  {
    init(
      this: todoist.LabelListInstance,
      { labels = [], query }: { labels: todoist.Label[]; query: string }
    ) {
      labels.forEach((label: todoist.Label) => {
        this.items.push(
          Item({
            title: label.name,
            subtitle: `Add label ${label.name} to task`,
            // Keep everything up to and including the last "@" in the query,
            // then append the full label name so Alfred can continue the query
            autocomplete: `${query.replace(/(^.*@).*/, '$1')}${label.name} `,
            valid: false
          })
        )
      })
    }
  }
)
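A minimal usage sketch for these factories follows. The label data and query string are invented for illustration; the real shapes come from the project's todoist typings, and ids here are placeholders.

// Hypothetical example data -- not from the project itself.
const exampleLabels: todoist.Label[] = [
  { name: 'errand', id: 1 },
  { name: 'work', id: 2 }
]

// Given a partially typed query ending in an "@" fragment, each label
// becomes an Alfred Item whose autocomplete swaps the text after the
// last "@" for the full label name, e.g. "Buy milk @err" -> "Buy milk @errand ".
// valid: false means selecting an item continues the query rather than
// submitting it.
const exampleList = LabelList({ labels: exampleLabels, query: 'Buy milk @err' })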