Commit History
misc fixes (d75adb9)
Fixed pre-commit problems, fixed small bug in logging_config to handle LOG_LEVEL env var (b1f4f7a)
Adding logging enhancement (553a86b)
Merge pull request #92 from OpenAccess-AI-Collective/flash-optimum (16bb627) [unverified]
chore: Refactor inf_kwargs out (dc77c8e)
Merge branch 'main' into flash-optimum (fd2c981) [unverified]
Merge pull request #177 from NanoCode012/fix/landmark-patch (8002ffb) [unverified]
Merge pull request #159 from AngainorDev/patch-1 (8e568bb) [unverified]
Fix strict and Lint (b565ecf)
Fix set mem_id for inference and refactor (974dc00)
Set mem cache args on inference (572d114)
fix formatting (958da70)
pass a prompt in from stdin for inference (c4e4f81)
address PR feedback (0c6f928)
add streaming dataset support for pretraining datasets (eea2731)
more tweaks to do pre-training with bettertransformers (1210dc8)
experimental expansion of ctx len (488a67d)
add flash attn context for efficient training and attempt setting model to train mode (8792199)
add support for optimum bettertransformers (1edc30c)
Merge branch 'main' into patch-1 (79e2a6f, Angainor Development) [unverified]
Remove explicit definition of cfg.inference (c250898, Angainor Development) [unverified]
formatting for linter (f36e227) [unverified]
Add streaming inference & fix stopping at EOS (fec6bcc)
Feed cfg.inference (bd3b537, Angainor Development) [unverified]