First of all - I guess some of your parameters are wrong:
ServerConnection connection = new ServerConnection(serverName);
Here, you need to pass just the server's name - so in your case, do not send in your whole connection string - just .\SQLExpress
As for your database name - I don't know if you can use SMO to back up an "attached" MDF file in SQL Server - normally, this value would be just the database name (no file name, no extension) for a database that lives on the server.
string dataBaseName = @"C:\database\mydb.mdf";
So my suggestion here would be:
attach this MDF file to your SQL Server instance (which you have installed anyway)
give it a meaningful name, e.g. mydb
then use just the database name as your value here:
string dataBaseName = "mydb";
With these points in place, your code does work just fine in my case, at least...
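Putting those two fixes together, here is a minimal sketch of the corrected setup - the server and database names are placeholders for your environment, assuming the MDF has been attached as mydb:
// server name only - not a connection string
ServerConnection connection = new ServerConnection(@".\SQLExpress");
Server sqlServer = new Server(connection);
// database name only - not the path to the MDF file
string databaseName = "mydb";
Backup sqlBackup = new Backup();
sqlBackup.Action = BackupActionType.Database;
sqlBackup.Database = databaseName;
sqlBackup.Devices.Add(new BackupDeviceItem(@"C:\mydb.bak", DeviceType.File));
sqlBackup.SqlBackup(sqlServer);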
@j4m4l: use SQL Server Mgmt Studio Express (a free download from MS) to attach your MDF to your server
– marc_s
Aug 28, 2011 at 18:25
@j4m4l: look at the SMO method AttachDatabase (sample found on MSDN )
– marc_s
Aug 28, 2011 at 18:38
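For reference, the attach step via SMO might look roughly like this - a sketch only; the file paths and the presence of a separate log file are assumptions:
// requires Microsoft.SqlServer.Management.Smo and System.Collections.Specialized
ServerConnection connection = new ServerConnection(@".\SQLExpress");
Server sqlServer = new Server(connection);
StringCollection files = new StringCollection();
files.Add(@"C:\database\mydb.mdf");      // data file from the question
files.Add(@"C:\database\mydb_log.ldf");  // log file, if one exists
// attach under a meaningful name, as suggested above
sqlServer.AttachDatabase("mydb", files);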
|
|
I am not able to create a backup of a database saved in a location like C:\database\mydb.mdf
error: Unable to create a backup
Backup sqlBackup = new Backup();
sqlBackup.Action = BackupActionType.Database;
sqlBackup.BackupSetDescription = "ArchiveDataBase:" +
DateTime.Now.ToShortDateString();
sqlBackup.BackupSetName = "Archive";
sqlBackup.Database = databaseName;
BackupDeviceItem deviceItem = new BackupDeviceItem(destinationPath, DeviceType.File);
//ServerConnection connection = new ServerConnection(serverName, userName, password);
ServerConnection connection = new ServerConnection(serverName);
Server sqlServer = new Server(connection);
Database db = sqlServer.Databases[databaseName];
sqlBackup.Initialize = true;
sqlBackup.Checksum = true;
sqlBackup.ContinueAfterError = true;
sqlBackup.Devices.Add(deviceItem);
sqlBackup.Incremental = false;
sqlBackup.ExpirationDate = DateTime.Now.AddDays(3);
sqlBackup.LogTruncation = BackupTruncateLogType.Truncate;
sqlBackup.FormatMedia = false;
sqlBackup.SqlBackup(sqlServer);
string dataBaseName = @"C:\database\mydb.mdf";
string serverName = @"Data Source=.\SQLEXPRESS;Integrated Security=True;User Instance=True";
string destinationPath = "C:\\mydb.bak";
Maybe I am passing the wrong variables?
Please, can anyone verify it and post the right solution?
Thanks in advance.
PS: the database is not password protected and can use mixed authentication
|
Backup/restore SQL database through AttachDBFilename
|
Haven't used it personally, but this gem seems useful: https://github.com/soundevolution/rails-backup-migrate
|
|
Rails has db:migrate, db:create, etc. to help rebuild the structure of the database, but I haven't found any solution that makes it easy to back up and restore on another machine.
We have to use mysqldump and other MySQL commands manually, which is a little tedious. I would like suggestions for other methods.
|
Is there some easy way to back up MySQL and then restore to another environment on Rails 3?
|
I didn't completely understand the question, but here you go (open source solution??):
Backing up and restoring iOS devices: iCloud and Wireless Syncing in iOS 5 - get a beta copy by becoming a paid registered developer, or wait until the fall.
Backing up and restoring Macs: Time Machine - buy a Time Capsule or connect a hard drive directly to the Macs; Apple makes it REALLY easy to back up and restore (I love my Time Capsule). Or buy a Mac Server instead of a Time Capsule and back up all of your Macs to that.
Backing up and restoring PCs: well, since it's going to crash anyway, you'd better look into this yourself - I'm not sure about PCs.
|
|
Does anybody know if there is an open source solution available
to back up/restore files/directories from an iPhone to Mac/PCs?
I thought using the internal iPhone web server and connecting via browser might be the standard procedure?
Any help would be appreciated ..
Thanks Matthias
PS: I just want to implement a backup feature in one of my iPhone apps, and
I'm looking for a ready-to-run open source library / source code samples
which demonstrate the file transfer (a SQLite database plus a few image files,
stored in the document path of my iPhone app)
to the file system of a client (Mac/PC).
Maybe with on-the-fly data compression?
1. starting the internal Web Server in my iPhone app
2. client connect (browser) to the iPhone web server
3. download files to the client.
|
Standard backup/restore using the internal iPhone Web Server
|
I wasn't able to get the Backup Plan Wizard working with XP Home, so I went ahead and manually set up the backups using the instructions given in the link above. It actually turned out to be pretty simple to do (a lot of steps, but they're all very easy). My first automatic backup ran overnight, and I was able to successfully restore the TFS databases. I've also set things up so that the backup files (full, incremental, and transaction logs) are saved to my dropbox folder.
|
I recently installed TFS 2010 on my laptop (Windows 7, 64-bit) for source control only (I don’t need reporting, Share Point, etc.) for some single-developer projects that I’m working on. I’m pretty happy with the source control, but I’m trying to figure out the best strategy for backing it up in case of a drive failure, fire, theft, etc.
I tried using the Power Tools Backup Plan Wizard, but I kept getting a “Grant Backup Plan Permissions” error saying that the account did not have the required permissions to create backups on the backup path. I suspect that the problem might be that I’m trying to back-up to an XP Home machine, and I think I need to have a machine that is on a domain. (I realize this isn’t an ideal setup, but it’s what I’ve got.)
So I’ve got a few specific questions, but I’m also open to other suggestions.
Is there a way to use the Backup Plan Wizard to back up to an XP Home machine?
If I can’t make the Backup Plan Wizard work, what about just backing up the SQL Server databases that TFS uses? The article here http://msdn.microsoft.com/en-us/library/ms253070.aspx says that you need to use marked transactions to keep the data consistent, but I’m wondering if I could skip that and still make it work (since I’m the only developer and I could make sure that I’m not logged into TFS while the backup is running).
Thanks.
|
Best way to back up a local install of TFS 2010 for a single developer
|
BACKUP DATABASE databasename TO DISK='C:\somefile.bak' WITH COPY_ONLY, INIT, FORMAT, CHECKSUM
Obviously, replace databasename and the target C:\somefile.bak as appropriate. Remember, the file and path are on the server; connecting remotely won't change where the backup file is stored - in other words, it won't be on your local machine.
You can omit the WITH options if you want. Drop INIT if you don't want the target .bak file overwritten. COPY_ONLY isn't a big deal either way in your case. CHECKSUM just validates the data before it gets backed up, and may not matter if you don't have CHECKSUMs turned on for the database - though new databases have had them on by default since SQL Server 2005.
The MS documentation for BACKUP and RESTORE isn't too difficult to understand in its basic forms. You can also use the Management Studio Tasks->Back Up or Tasks->Restore GUI if you have access to it.
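If you would rather run the backup from code than from the GUI, here is a minimal ADO.NET sketch - the connection string and database name are assumptions, not from the question:
using System.Data.SqlClient;

string connStr = @"Data Source=YOURSERVER;Integrated Security=True";
using (var conn = new SqlConnection(connStr))
using (var cmd = conn.CreateCommand())
{
    conn.Open();
    cmd.CommandText =
        "BACKUP DATABASE [YourStoreDb] TO DISK='C:\\somefile.bak' WITH COPY_ONLY, INIT, CHECKSUM";
    cmd.CommandTimeout = 0;  // backups can easily exceed the 30-second default
    cmd.ExecuteNonQuery();
}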
|
I am installing a service pack on our shopping cart. They recommend backing up the SQL database before installing. I know we have backups to tape drives done by our hosting company, but I want one I know the exact time stamp for and can access quickly if I need to reload it because of a goof during the upgrade. (I don't want to have the store down for any longer than needed.)
How do you recommend backing up a SQL database for easy reloading for someone who is used to just writing queries and stored procedures? (I'd like to get everything - mappings & indices, etc - because I wouldn't know what all of them are or how to recreate them.)
I access the database via Remote Desktop and can link my hard drive and DVD drives, if that helps. It's MSSQL 2008.
Thank you so much.
Best wishes,
Andrea
|
how to backup SQL database (tables, mappings, indices, etc.)
|
this messes up the encoding (even though I save as utf8)
UTF-8 is not a good choice for arbitrary binary data. There are many sequences of high-bytes which are not valid in UTF-8, so you will mangle them at some point during the load-alter-save process.
If you load the file using an encoding that maps every single byte to a unique character, and re-save the file using that same encoding, you should preserve the original content(*). ISO-8859-1 is the encoding usually chosen for this purpose, since it simply maps each byte 0..0xFF to the Unicode code point with the same number.
(*: assuming the editor is binary-safe with regard to other tricky points like nulls, \n/\r and other control characters... I believe EmEditor can be.)
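As an illustration of the byte-preserving round trip - a sketch in C#, not EmEditor-specific, and the file names are placeholders:
using System.IO;
using System.Text;

// ISO-8859-1 maps every byte 0..0xFF to a character and back unchanged,
// so a read-modify-write cycle with it does not mangle binary blob data
Encoding latin1 = Encoding.GetEncoding("ISO-8859-1");
using (var reader = new StreamReader(@"C:\dump.sql", latin1))
using (var writer = new StreamWriter(@"C:\first-insert.sql", false, latin1))
{
    for (int i = 0; i < 25; i++)  // roughly through the first INSERT, as in the question
    {
        string line = reader.ReadLine();
        if (line == null) break;
        writer.WriteLine(line);   // note: normalizes line endings to \r\n
    }
}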
|
|
There is a MySQL backup file which is a huge file - about 3 GB. There is one table that has a LONGBLOB column that stores JPEG image data.
The file imports successfully if done from MySQL Workbench - Data Import/Restore.
I need to open this file and extract the first few lines (about two rows of INSERTs of the table with the image data) so that I can test if another program can import this data into another MySQL database.
I tried opening the file with EmEditor (which is good at opening large files), copying everything up to the end of the first INSERT statement of the script (up to about line 25, because the table in question is the first table in the backup script), and pasting the selection into a new file.
Here comes the problem:
However this messes up the encoding (even though I save as utf8). I realize this when I try to import (restore) this new file (again using MySQL Workbench) into a MySQL database, the restore goes ahead without errors, but the JPEG images in the blob column are now destroyed/corrupted.
My guess is that the encoding is different between the original file and new file.
EmEditor does not show the encoding of the original file; there is an option to detect it, and it detects 'UTF8 Unsigned'. But when saving, I save as UTF-8. I also tried saving as ANSI, ISO-8859 (Windows default), etc., but every time the result is the same.
Do you have any solution for this particular problem? I.e., I want to cut only the first few lines of the huge backup file and save them to a new file, keeping the encoding the same, so that the images (blobs) are not changed. Is there any way this can be done with EmEditor (or do I have the wrong approach, i.e. cut-paste)? Is there any specialized software that can do this? How can I diagnose what is going wrong here?
Thanks for any responses.
|
Using EmEditor saving a Unicode file to another format distorts/changes the format. Solution?
|
Instead of deleting the app to make changes, install an updated app with the old app in place to preserve the content of the Documents folder (NSDocumentDirectory).
You could also save the data in the cloud whenever the app exits.
save the data in the cloud? Can you explain this?
– cyclingIsBetter
Jun 30, 2011 at 7:56
Instead of storing your data in the application that gets deleted, save any data outside the application e.g. somewhere on the internet/cloud/sprawl.
– Niels Castle
Jul 2, 2011 at 14:05
|
|
I'm creating an app, and when a person uses it, a lot of information is recorded in a txt file.
If I want to make changes to this app, I must delete the app from the device. So how can I save the txt file before I delete the application?
Is there a way to do this?
|
iOS: save a csv or txt file when I delete an app
|
Yes you can, please refer to Exchange 2007 and Windows 2008: Online Exchange Backup (part 6 of 7).
Get the edb, chk & log file paths using get-StorageGroup, get-MailboxDatabase & get-PublicFolderDatabase for Exchange 2007 or get-MailboxDatabase & get-PublicFolderDatabase for Exchange 2010,
and do the usual VSS stuff to copy them.
Make sure to signal VSS backup success to purge the log files.
|
|
Can Exchange Server 2003 and 2007 be backed up using the VSS APIs that are provided for Exchange 2010?
Thanks
|
Backup of Exchange Server 2003/2007 using VSS APIs
|
RoboCopy seems to do the trick requiring just a couple of lines of script.
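For illustration, the backup and the rollback could each be a single RoboCopy mirror - a rough, untested sketch using the paths from the question below:
rem back up the remote folder to a local copy
robocopy \\SOMEOTHERPC\Destination C:\Backup /MIR
rem roll back: mirror the local copy over the modified remote folder
robocopy C:\Backup \\SOMEOTHERPC\Destination /MIR
/MIR mirrors the directory tree, deleting destination files that no longer exist at the source, which is what makes the second command restore the previous state.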
|
I'm looking for a good solution to back up and roll back a folder using the Windows (XP/Vista/7) OS.
As an example, say I have a folder called "\\SOMEOTHERPC\Destination" (containing files and subfolders). I wish to take a backup of this folder to my local disk. Then, say, I edit or delete some of the content of this "\\SOMEOTHERPC\Destination" folder and afterwards roll the folder back to its previous state.
What's the best way to go about this? I am thinking of writing a PowerShell script, but perhaps there is a better way. Any suggestions appreciated?
Thanks.
|
Backup and Rollback Script for a File Transfer
|
I'm reasonably certain that the creation of standby databases is a feature of Enterprise Edition only.
|
|
I was trying to prepare a primary database for standby database creation. While executing set_log_params.sql, we got stuck with the errors "ORA-32017 failure in updating SPFILE" and "ORA-00439 feature not enabled: managed standby".
From the detailed error message, it can be seen that the error occurs when setting the "log_archive_dest_2" parameter.
We are using Oracle 11g Standard Edition with RHEL 5.
It would be appreciated if you could tell me whether there is an easy way to set up a backup database server.
thanks and regards
Jayalaxmi
|
can not prepare primary database for standby database creation (for RMAN)
|
You should place your SQL files in the Data folder of the application, so you do not need to set the data folder in your application.
Then you can do anything you want: just right-click on your project and add a Data folder. This applies especially if your project is an ASP.NET project.
But the easiest way is to set your connection string like this:
<add name="FamilySystem.Properties.Settings.FHDBConnectionString"
connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=C:\MyApp\FHDB.mdf;Integrated Security=True;User Instance=True"
providerName="System.Data.SqlClient" />
"Database 'FHDB' does not exist. Make sure that the name is entered correctly. BACKUP DATABASE is terminating abnormally." - I tried it :)
– special life
Apr 16, 2011 at 7:21
can you describe your project folder structure. also asp.net or win application?
– Farzin Zaker
Apr 16, 2011 at 7:22
The project is a Win app, and my DB is in the solution folder for my project; I added it to my solution via "Add Existing Item".
– special life
Apr 16, 2011 at 7:27
|
|
I made an application with VS.NET C# 2010 Express Edition, using SQL Express as well.
In my solution I used the Data Explorer to create and manage the tables of the DB, and this is my connection string:
<add name="FamilySystem.Properties.Settings.FHDBConnectionString"
connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\FHDB.mdf;Integrated Security=True;User Instance=True"
providerName="System.Data.SqlClient" />
and I place my DB in that directory:
AppDomain.CurrentDomain.SetData("DataDirectory", "C:\\MyApp");
1. Now if I want to make a backup of the DB, an error appears saying the database doesn't exist - so I also can't restore it.
2. What is the best location to place my database, and what about the "Copy to Output Directory" property - what is the best value for it?
I want to know what is wrong, because I have been stuck with this for a month and I can't figure out how to solve it. I really need your help and support :S
Thanks in advance; if you want to know more, tell me.
|
I am stuck with a SQL database and its location (in a Windows application project, C# 2010)
|
Have you tried setting the device as "online" after enabling it?
|
I have been trying to set up a backup on Symantec Backup Exec 10d and it is not working. I have the external HD formatted as NTFS and mounted to the O:\ drive, but it does not show up in the "Devices" tab. I added it as a removable storage device, but that only created the folder with nothing else in it. Every time I run the test I receive the error "No online media". Does anyone know what could be causing the device not to register?
|
Setting Up a Backup on an External HD with Backup Exec
|
For standard master-slave MySQL cluster setup, "cluster backup" is backup of the master.
And yes, it's possible to do it with Workbench.
Do you have any link to do the cluster backup ?
– Rajesh22
Apr 4, 2011 at 10:03
Yes, I read your answer. For cluster backups we used to take the backup using ndb_mgm (dev.mysql.com/doc/refman/5.0/en/…). Are there similar steps to do the same in MySQL Workbench? If so, please give me the URL to refer to.
– Rajesh22
Apr 4, 2011 at 10:15
|
|
Using MySQL Workbench, is it possible to take a cluster database backup?
|
Is it possible to take cluster database backup using MySQL Workbench?
|
Of course -- use MySQLDumper. You can automatically back up your databases to another host if you like!
Features
Send dumpfiles via FTP to up to 3 different servers. This also works when using the multipart feature.
Automatic file deletion: set your own rules to delete old backups. Specify the number of backups you want to keep and let MySQLDumper automatically delete the older ones to save server webspace.
MySQLDumper can do multipart backups. That means it can automatically split the dumpfile if it gets bigger than your chosen size. When you want to restore a backup and choose the wrong part, it doesn't matter: MySQLDumper will notice and will pick the correct start file automatically.
Security: MySQLDumper can generate a .htaccess file to protect itself and all of your backup files.
Good reading resource for alternatives
10 Ways to Automatically & Manually Backup MySQL Database
|
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 10 years ago.
So the only method I like using, and think is the simplest, is mysqldump for backing up MySQL databases. Right now I'm using phpMyAdmin to back up the tables. Is there any way I can code a script that does it automatically (preferably every day)?
And how do I back up the files on my server? I have an images folder that I need to back up, and I'm not sure exactly how to go about backing those up.
|
Two questions about backing up your website (mysql and files) [closed]
|
When you restore a database backup you need to go into the restore options and update the paths of the MDF/LDF files to wherever you want them on the new machine. The default is to keep these paths the same and obviously they may not exist on the new machine.
|
Why does SQL Server 2005 suck so much?
I have to work on an old system that uses SQL Server 2005!
My problems are:
1. When I detach a database and move the MDF/LDF files of that DB to another drive, I cannot attach the files after the move!
What is the best way to change the location of MDF/LDF files to another drive?
2. When I back up a DB whose MDF/LDF files are on drive E, I cannot restore that .bak file on another system that has no drive E. What is going on with SQL Server 2005? I tested the same job on SQL Server 2008 and everything was OK. How can I solve this issue?
I am using Windows 7 Ultimate / SQL Server Management Studio Express Edition.
Thanks in advance,
best regards
|
SQL Server 2005 (for old systems) and many problems with attach and restore (changing drive location)
|
Be careful with the instance store: if your instance is terminated, you will lose your data. I suggest you put the important data on EBS volumes.
Please see my post http://www.capsunlock.net/2009/12/create-ebs-boot-ami.html
It's possible to clone the current instance and make an EBS-backed AMI.
|
|
I currently have an Amazon instance (Medium - High CPU) running off the instance store, with most of my data and code sitting in /mnt mounted to sda2. The instance is just the way I need it to work. How can I clone this instance and make an exact copy (data and all) on another (preferably cheaper, micro) instance for testing my new code changes? Also, what backup suggestions are recommended for this setup?
Thanks
|
Cloning an Amazon Linux Instance
|
Perhaps this one: Hobocopy
|
I am looking for a program that uses shadow copy to copy the contents of a Windows XP system volume that is running.
I.e. I want to clone the system volume with the following snags:
(1) I want to be able to select which files to copy (i.e. not the entire file system)
(2) This is probably implied by (1), but I also have to avoid sector-by-sector copies
(3) I do not want to clone a file system into an image file and restore to a 3rd drive but want to do a filesystem to filesystem copy
All the backup/clone utilities I looked into stumble on one of above points. Any ideas?
|
Shadow copy to clone system volume on Windows XP
|
If it fails by showing only a blank screen, it likely means PHP ran out of time or memory; either way, reducing the size of the import would help.
You can often save a lot of space by clearing logs prior to the export. Also, you can export-import some tables separately to spread the work over two operations, phpMyAdmin has a table selection box on the export page.
|
|
I have a large database (about 244 MB+) and I want to transfer it from the production server to the development server for fixing bugs, upgrading, and also backup. The hosting provider is Siteground.com.
When I try to make a backup of the Magento database using the Magento backend backup utility, phpMyAdmin, or Navicat for MySQL, it unfortunately fails each time, showing no error.
I need the Magento community's help. What should I do to accomplish the backup successfully?
Any help would be greatly appreciated!
~Shaman
|
Magento Large Database Backup/Restore Failed
|
The error message given at the top does not match the error message given in the code for some reason.
The error message lacks reporting of $ERRNO. See fork(2) for the modes of failure of this system call.
Improve the error reporting, and then you need not guess about the cause.
|
Suddenly, one server cannot be backed up. I get a strange error message:
2011-01-04 10:10:37 host1: Can't fork at /usr/share/backuppc/lib/BackupPC/Lib.pm line 1128.
What does this error mean?
All other hosts (with the same OS) don't have this problem.
Thanks in advance for any reply. :)
$cmd = [split(/\s+/, $cmd)] if ( ref($cmd) ne "ARRAY" );
print(STDERR "cmdSystemOrEval: about to system ",
$bpc->execCmd2ShellCmd(@$cmd), "\n")
if ( $bpc->{verbose} );
if ( !defined($pid = open(CHILD, "-|")) ) { # <<<<<<<<< 1128
my $err = "Can't fork to run @$cmd\n";
$? = 1;
$$stdoutCB .= $err if ( ref($stdoutCB) eq 'SCALAR' );
&$stdoutCB($err) if ( ref($stdoutCB) eq 'CODE' );
return $err if ( !defined($stdoutCB) );
return;
}
binmode(CHILD);
if ( !$pid ) {
#
# This is the child
#
close(STDERR);
if ( $ignoreStderr ) {
open(STDERR, ">", "/dev/null");
} else {
open(STDERR, ">&STDOUT");
}
alarm(0);
$cmd = [map { m/(.*)/ } @$cmd]; # untaint
#
# force list-form of exec(), ie: no shell even for 1 arg
#
exec { $cmd->[0] } @$cmd;
print(STDERR "Exec of @$cmd failed\n");
exit(1);
}
|
BackupPC error: Child exited prematurely
|
The cronjob might do the following before committing:
cd /working-copy
svn status | grep ^\? | awk '{print $2}' | xargs svn add
This approach only works if you don't care about reviewing what's going to be added, though. It'll add all new files.
EDIT: Edwin Buck pointed out that svn status only lists new directories, not the files inside them. svn add adds a new directory recursively (including its files), though.
|
I'm using an svn repository for the backup of specific folders on my server.
I'm using cron to commit the changes regularly.
However, I was wondering whether it takes newly added files into consideration as well, or just the changes to existing files.
If it only commits changes to existing files, what should I do? Abandon the idea of using svn for backups?
|
svn as a backup system on Ubuntu: what about newly added files?
|
I think there is no proper solution for the general case.
An installer is just a program which places your files into suitable directories and performs some additional actions, including registry manipulations.
As far as I know, there is no "message" sent by installers before or after installation.
|
|
I want to back up my system registry when an application is about to be installed or uninstalled. What message is sent by the installer? Thanks in advance.
|
What message is sent when an application is installed or uninstalled
|
The solution is to set in gpedit.msc at Computer Configuration\Administrative Templates\System\Scripts the key Maximum wait time for Group Policy scripts to 0. The default time is 600 seconds (10 minutes).
|
I'm trying to launch ntbackup on shutdown, but the backup file (.bkf) is only 1.1 GB instead of 2.2 GB. When I try to restore using this file, ntbackup tells me that it is corrupted. If I run the same command as a bat file it works (the command is: ntbackup backup C:\ systemstate /m normal /f "X:\Backup_20-12-2010.bkf" /j "Bckp_Data-and-System" /L:s ). The log file tells me that everything is OK. What can I do?
|
NTBackup problems on shutdown
|
Hmm... possible, but which database are you using? If you are using JavaDB or Apache Derby, you already have all the tools you need. You can make a JDBC call to create backups. Here is the code:
// assumes an open java.sql.Connection conn to the Derby database
String sqlstmt = "CALL SYSCS_UTIL.SYSCS_BACKUP_DATABASE(?)";
CallableStatement cs = conn.prepareCall(sqlstmt);
cs.setString(1, "D:/dbbackups/"); // directory the backup is written to
cs.execute();
cs.close();
And for scheduling the backup tasks you can make use of Quartz. It is free, open source, and a good job scheduler. If you use some other DB, you can still invoke its command-line backup tool from the scheduled task (e.g. via Runtime.exec).
Making a copy of local data is just 10% of backup job. Normally a backup process also includes "rolling" backup archives, transferring them to other servers, making Amazon S3/EBS copies, etc..
– yegor256
Nov 25, 2010 at 8:55
@Vincenzo , I don't think there is a framework to automate the entire process
– Abdel Raoof Olakara
Nov 25, 2010 at 9:44
|
|
I'm trying to find a Java library/framework which I can add to my WAR to enable regular backup of files and databases (started on my own timer). I don't want to use a standalone backup solution (located/maintained apart from my WAR), mostly because it's a bigger maintenance headache.
Do you know any such libraries?
|
Is there any embedded database/files backup framework for Java?
|
You possibly did nothing wrong at all; mysqldump has a bug where it can doubly-encode utf8 data. Check the dump file byte-by-byte - does it look like this is what's happened?
|
I have been making regular backups of my forum's database each and every day. Today I had to revert the forums to an earlier backup.
Command I use to backup my databases:
mysqldump --opt -hMY_HOST -uUSERNAME -pPASSWORD DATABASE_NAME 2> error.txt | gzip > DATABASE_NAME_2010_11_06_14_38.gz 2> error.txt
I used phpMyAdmin to import the database.
After I reverted the forums, all the Unicode chars were converted to garbage.
Example: http://www.everydayfiction.com/forums/index.php/topic,2376.msg11198.html#msg11198
How do I convert these garbage chars back into their Unicode versions or an equivalent ASCII version?
What did I do wrong when exporting or importing the backup file?
|
Reverted a database and the Unicode chars got converted to garbage
|
You may install Cygwin on your Windows machine and use a simple shell script like the following one:
#!/bin/bash
# create one archive per entry in the current directory
for i in *; do
    tar czf "$i.tar.gz" "$i"
done
# specify your destination here:
rsync *.tar.gz /tmp
rm *.tar.gz
BTW, that's not the most straightforward way, I suppose.
|
All sources are on a Windows OS, and the backup destination is on a Unix system (we are using Samba).
My source repository is similar to :
-Repository
--Folder1
---Files
--Folder2
---Files
etc...
I would like to get a destination similar to :
-Repository
--Folder1.zip
--Folder2.zip
etc...
After the first backup, I only back up files that have changed since the last backup (or new folders/files that have been created).
Does anyone know a tool or a script for my backup needs? Can this be done with Robocopy?
|
How to backup source repository and zip destination folders?
|
You can use a Maintenance Plan
It will create a job manageable through SQL Server Agent; alternatively you can use SSIS (the Maintenance Plans use SSIS anyway).
One note: in the configuration of the backup activity you can only browse local paths; however, you can enter a network path manually.
|
|
I need to take a backup of a database in SQL Server to a network location by creating a job in SQL Server 2005.
Does anyone know how to do it?
|
How to take backup of database in SQL server on network location through a job?
|
As far as I understand it, you're confusing two things:
A copy operation, which simply is a file system operation where no VCS whatsoever is involved. Otherwise your notion of 'overwrite' doesn't make any sense, because VCS systems are there to avoid exactly that.
A scheduled check-in operation, which can only be initiated from a client machine. Therefore, it of course has to have a SVN client installed.
Thomas
Having re-read it again - you're correct, it's confusing - so please ignore the copy operation item.
– Darknight
Oct 10, 2010 at 13:10
I see. So what you have in mind is an automatic, scheduled check-in. This must be done by a client, because a version control server never pulls from clients, it only receives push operations. So yes, you need to have SVN clients installed. I'm sure you'll find respective scripts if your searching the SVN homepage subversion.apache.org.
– Thomas Weller
Oct 11, 2010 at 4:47
|
|
I have a Windows server, and I can install VisualSVN on this machine.
What I would like to do is use some kind of batch script (MS-DOS .BAT) to remotely back up files to this server over https (I can schedule this script to run nightly).
< ignore > This will always be a one way copy, i.e. what ever is in the respository should ALWAYS be over-written with what ever is in the client machine < / ignore >
Assume I have a local folder on the client machine as such:
c:\Data
Question:
(a) On the client machine(s) would I need to install any kind of SVN client software?
(b) Can anyone help with the batch script to achieve this?
Edit:
I assume SVN will automatically transfer only files that have been updated?
|
use of VisualSVN as a remote backup
|
It is suggested to use svnadmin dump and svnadmin load for repository relocation, but a filesystem copy from your restored drive should do the job as well.
You won't be able to restore any history from your working copies.
|
|
My Windows desktop machine died the other day, and I had a local svn server running on it.
Is it possible to restore the repository from the repo file structure alone?
I know I should have been backing it up, but I didn't, for whatever reason, as the code in it was mostly just playing about and experiments. If the repo can't be restored it's no problem, as I have all the working copies locally - I'd just like to know if it's possible...
|
Restore svn without any backups
|
Were I you, I would create a script that does the backup and then sends the backup elsewhere. I know that is kind of what you are asking how to do, but you left out some things that would be good to know, such as: what OS are your two systems running?
If they are both Windows, you could mount a network drive and have the backup dump there (or copy the dump there). If they are Linux servers, I would recommend copying it across using the scp command. If it is a mix, then it gets fun and tricky.
If you are working with Linux servers, the following guide should walk you through the process of backup. Click me!
If you are still scratching your head after reading that, let me know what kind of OSes you are rolling with and I can provide more detailed instructions.
Good luck!
Hi! Thanks for your comment. The LIVE server environment is Linux while I would like to move the dump files (i.e. copy and delete, without manual intervention) to a Windows environment (my home PC). Also, are there any free tools to automate this process? Thanks!
– TMM
Sep 12, 2010 at 9:35
|
|
I have 2 databases with MyISAM tables which are updated once a week. They are quite big in size (one DB is 2GB and the other is 6GB). I currently back them up once a week with mysqldump and keep the last 2 weeks' worth of .sql dumps on the same server where the DBs are running.
I would like, however, to be able to dump the backups to another server, as they are taking up server space unnecessarily. What is the best way to achieve this? If possible, I would like to keep the databases running during the backup. (no inserts or updates take place during the backup process, just selects).
Thanks in advance,
Tim
|
MySQL - how to backup database to a different server?
|
I'm guessing you have an app that backs up data when a user clicks on something, right? I'm thinking: get all the info connected to the user (this depends on how you did your user model, so maybe you should have a get_all_info method), then write it out in SQL format to a file, which you save as .sql (using either File.new or Logger.new).
|
I need to make a backup system for my Rails app, but it has to be a little special: it doesn't have to back up all the database info and files into a single file or folder, but rather back up the database info and attachment files per user. I mean, each of these backups should be able to regenerate all the information and files for one single user.
My questions are:
Is this possible? What's the best way to do it? And, if it's impossible or a bad idea at all, why is it?
Note: The database is a MySQL one.
Note2: I used Paperclip for the users uploads.
|
What's the best way of backing up a rails app data?
|
For this and other reasons we decided to co-locate our application with the database, so this problem became moot.
|
Our Java server application logs data to a SQL database, which may or may not be on the same machine. Currently we use MS SQL Server, and we're now porting to MySQL. A user configures database backup parameters on our app server, e.g. time of day to run a backup, and the app server executes SQL Server's BACKUP DATABASE command at the appropriate time, via a sproc. It does incremental backups daily and full backups weekly.
MySQL lacks an equivalent feature to tell the database from a client connection to back itself up. Options we're considering are:
Create a UDF to shell out to mysqldump (or copy database files), which can be called from our app server via a sproc. Essentially we'd be implementing a version of BACKUP DATABASE for MySQL.
Create a service to run on the MySQL box that can get the backup settings from the app server and run mysqldump (or file copy) locally.
Create a backup sproc to mimic mysqldump, e.g. SHOW CREATE TABLES and SELECT INTO OUTFILE for each table.
Setting up a cron job, Perl script, third-party app or other tricks that'd work great in a data center aren't preferred; this is a shrink-wrap package that needs to be pretty robust and hands off.
Database sizes can range from roughly 10MB to 10GB.
I'm aware of the binary logs for the incremental piece. I figure the general solution will probably apply to them as well, if we decide to use them.
This is all on Windows 2003 32-bit or 2008R2 64-bit, MySQL 5.1.
The UDF option seems the best to me. The UDF Repository (http://www.mysqludf.org/) has mysqludf_sys, which may be all we need, but I thought I'd ask for opinions since after extensive googling it doesn't seem like others have reached the same conclusion, or maybe our needs are just out of the ordinary. Our app is the only thing in MySQL, so I'm not worried about other users having access to our UDF.
Any solutions I'm overlooking? Any experience with using UDFs in such a way?
Thanks,
Eric
|
Remote backup of MySQL database
|
Check out https://www.dropbox.com/ - it's very nice; unfortunately you have to pay if you need more than 2GB.
|
|
How can I automatically back up a folder, or multiple network folders, to another computer?
We have tried Allway Sync.
Is any other software better? Please suggest.
|
Auto backup of a folder over a network?
|
You could use PowerShell to achieve a rather complex backup procedure, or you could write the script in Python or another scripting language.
PowerShell can be convoluted at first, but it's basically batch scripts on crack, with access to the .NET runtime.
Note: I believe PowerShell only comes with Server 2003, Vista, and Windows 7 (it may work on XP/2000, but I'm not sure). Python can be installed on pretty much anything.
|
|
I'm creating a backup strategy for a sharepoint server I'm setting up.
Have got a backup running daily.
In the long term I'd like to keep:
* Daily backups for the last week.
* Weekly backups for the last month.
* Monthly backups for the last year.
* Yearly backups.
If I was writing in bash/cygwin I would find it fairly easy to write a script to purge backups that are not required by this strategy. However I would rather not have to install cygwin, I'd rather do it with native DOS scripting, or some other specialized tool.
My DOS scripting skills are very primitive, so I was wondering if anyone else had a similar script/util I could use.
Cheers!
|
Script or util to remove old backups
|
You're going to have to transfer this account differently. This procedure should work for you, but will require some more sysadmin expertise. As a summary, you're going to make a cPanel backup minus the home directory (generally the largest folder), restore that backup, and then rsync the home directory over to the server.
On the old server run: /scripts/pkgacct --skiphomedir <username here>
Restore the cPanel backup on the new server.
On the old server, run the following command: rsync -zaHlv -e ssh /home/<username here>/* <username here>@<new server here>:/home/<username here>/
This will take time, but will get the entire home directory moved over.
|
|
I am not a server person and I am having issues with this particular transfer. I have transferred accounts in the past with success.
With this particular transfer, I always come across this error:
ERROR: tar of split archive ran out of space
I increased the quota on the old server for this account, but that did not correct the issue. Additionally, one of the disks on the old server is 69% full. How would I go about freeing up some space?
I'd like to avoid having to do this manually, as the account I am trying to transfer is very large.
|
Transfer account from one server to another server using cPanel
|
"The important point is we want to carry the backup with us from the office to home every day."
This is on the level of "do we want to go to work or not". Backups that are not stored externally ARE NOT BACKUPS. Ever heard of buildings going down? Burning out?
You NEED backups that are stored at least far enough away to survive a larger fire.
Both scenarios are feasible. Tapes have more / larger capacity if you grow.
Also remember that soonish we'll get... writeable 100 GB Blu-ray discs ;)
|
|
We are an online business. Currently we are using DVDs for our backups. The problem is we are running out of space.
I guess there are two alternatives here:
external hard disk drives
tape drives
The important point is we want to carry the backup with us from the office to home every day.
Which alternative do you think would suit best our needs?
|
Backups for online businesses - external hard drives or tape drives?
|
I think I've found an answer: I made a transport set. Transport sets are documented in the Portal Administrator's guide http://download.oracle.com/docs/cd/B14099_19/portal.1014/b19305/cg_imex.htm#i1030999.
|
|
I am new to Oracle and I've developed an Oracle Portal using OracleAS Portal 10g Release 2 (10.1.4). I need to move the application to another computer, and I was thinking of copying it onto a CD, but no matter what I tried I wasn't able to find the portal pages, nor the images or documents that I've included.
Can anybody help me please?
Thank you!
|
Backup for OracleAS Portal
|
Disclaimer: My experience is with git rather than hg, but as I understand it the concepts apply equally to both systems.
An advantage of backing up to a remote repo is that if your local repo becomes corrupted (perhaps due to a problem with the underlying filesystem), that corruption does not get transferred over to the backup, unless the files in your working tree themselves are corrupted.
For example, it's possible for some of the objects in the repository, perhaps those which are rarely accessed because you don't change them, to become corrupted. It could be months before you use one of those files again, and so months before you notice (though I think doing a garbage collect run, eg git gc, will detect corruption).
So if you are backing up by pushing commits, you're creating an independent version of those objects, and using checksums (ie the commit hash) to verify the transfer of any new files. Whereas if you are backing up to a backup provider, you're duplicating the actual objects in the repo, in whatever state they are in, and duplicating any changes to those files, including corruption of them.
Usually backup providers will give you rollback (spideroak seems to be particularly good for this) but you'll still have to sift through a lot of versions to figure out when the corruption happened; also with some providers, the rollback period is limited (especially for free accounts).
|
I'm currently signed up with a third party service that hosts my mercurial repositories as a central hub to push my changes to as a sort of backup.
Now, I'm looking at a system to back up my laptop and am considering Mozy. I'm a lone developer, working on a laptop that is usually connected to the internet via wifi and only really on when I'm working, so I feel something like Mozy is my best option.
My question is, if I'm the only developer, could I get away with just using local mercurial repos and using Mozy to backup everything up? Rather than pushing to an external repo?
Many thanks
Matt
|
Can I use "Online Backup" to backup my DVS instead of pushing to an external repo?
|
Before anyone can answer your question, you need to specify:
1) How did you take your snapshot? I.e., what form is it in (.sql or raw binary), what is the filename, etc.?
2) What version of MySQL are you running?
Given that you have a text file dump of the database, and you want to retrieve a subset of it, you may want to load up the snapshot into another database (preferably on another machine), and export only the data of interest. Then you can import that into your production database, using any number of tools, such as the mysql command line, or MySqlAdministrator.
for more information, see http://dev.mysql.com/doc/refman/5.5/en/backup-and-recovery.html
(change the 5.5 to the version of your particular installation for more exact instructions, if necessary)
|
I have a snapshot of the database that I took before I did some data migration. The data migration messed up, so I want to retrieve the data from the snapshot to migrate over to the production database.
Question: How do I access the snapshot, and how do I retrieve the available data?
|
MYSQL: Copying a table from a snapshot into production
|
This is probably due to php.ini settings.
For instance, max_execution_time needs to be high enough to upload such a large file. I'm not sure if any other ini directives are pertinent.
|
|
I have a PHP script to back up cPanel automatically and upload the .tar.gz concurrently to an FTP server.
The script works fine.
The script requests the file, then starts the FTP transfer, and when the upload is complete it also sends a confirmation email.
The only problem is that even if the backup is, for example, 1GB, the uploaded file is only 170MB.
It seems it's not able to upload large files. In fact, with a small backup (for example 16MB or 20MB) everything works fine.
You can see the complete file here http://pastie.org/949680
|
Backup cPanel with PHP and upload to another FTP
|
I'm pretty sure no RDBMS can be configured to do this, and I doubt they ever will. First, SimpleDb is a quasi-competitor to commercial databases so it wouldn't make sense for most companies to build this. Second, it would have a huge negative effect on transaction performance since writing transactions to SimpleDB would probably take at least 100 times as long as writing them to disk.
Hm, interesting. I thought that because of the performance you mentioned, the competition is not so direct, and that the other features of SimpleDB (data redundancy) would make it compelling as a central backup solution (at least we have such a business need).
– István
May 7, 2010 at 6:13
It's a backup, and since SimpleDB has no locks I don't see a transaction hit - just add the transactions to the to-do list with an ID; you can throw multiple threads at it. Storage is cheap, too.
– Tom Andersen
May 19, 2010 at 13:48
|
|
Is there any solution (RDBMS or NoSQL) that can use SimpleDB as a backup?
Thanks,
A.
|
SimpleDB as backup?
|
Use an ETL tool, something like Pentaho Data Integration or Talend. You'll still have to think about how to identify the data you want to retrieve, but you get at least the following (from the PDI feature list):
Rich transformation library with over 100 out-of-the-box mapping objects
Broad data source support including packaged applications, over 30 open source and proprietary database platforms, flat files, Excel documents, and more
Advanced data warehousing support for Slowly Changing and Junk Dimensions
Proven enterprise-class performance and scalability
Integration (EII), advanced scheduling, and process integration
Unified ETL, modeling and visualization development environment for design of BI applications
It basically comes down to "it will work with whatever your data sources are, it will be robust and you'll be able to apply the expertise gained elsewhere".
|
|
I am developing an application to be run on a central server and on distributed computers.
I am supposed to write an application to back up the data from the distributed machines and merge it on the central server. I thought of compressing the whole local database and sending it to the server for merging, but as the database grows, the size of the compressed file grows too. So is there any way to merge data on the central server without sending the whole database? I need to do it on a daily basis: take a backup daily and send it to the server.
|
Marking Changes to database
|
You should try SQLyog -- a MySQL GUI. Its SJA component makes scheduling backups/synchronization pretty smooth for users, and it is also available in a Community Edition.
Disclaimer:
I work for Webyog, Inc. -- the publishers of SQLyog.
Edit: I'd also like to add that I find it very reliable and fail-safe, and use it to manage production servers.
|
|
I am working on nightly and hourly backups of MySQL Databases. There are multiple MySQL databases which are either InnoDB or MyISAM (Note: Each database is either InnoDB or MyISAM for a reason). With the 2 different types I want to make sure I am grabbing everything that is needed for backup and recovery. Here is my current plan
Nightly
-mysqldump of each DB which is stored locally and remotely.
Hourly
-flush binary logs and store them locally and remotely.
Weekly
-expire binary logs older than a week.
I feel like I am grabbing everything that is needed for the MyISAM databases, but I am concerned about the InnoDB databases and the log files they create (ib_logfile0, ib_logfile1, ibdata1). Should I back up these files? Nightly? Hourly? Both? Do I really need them if I am already doing the above nightly and hourly backups?
|
Backup & recovery of multiple MySQL databases (InnoDB & MyISAM)
|
The High Availability component of Exchange 2010 uses some parts of the streaming backup API under the covers - for doing reseeds for example. The full API isn't supported though.
|
Exchange 2010 does not support the ESE API for doing backups like it did in 2003 and 2007 according to MSDN. I Quote: "Exchange 2010 no longer supports the ESE streaming APIs for backup and restore of program files or data. Instead, Exchange 2010 supports only VSS-based backups."
So my question is, if this is the case, why is the DLL (ESEBCLI2.DLL) still shipped with exchange 2010? I found it under C:\Program Files\Microsoft\Exchange Server\V14\Bin. Am I missing something here?
|
Exchange 2010 and ESE Backup API
|
OK, I found the solution. It is actually pretty simple if you know a bit more about Vista than I do.
The backup folder is configured the same way as under XP, but the administrator permissions are not available until the process is in elevated mode. To get there, I just needed to add a manifest to the executable which requires the administrator privileges:
<requestedExecutionLevel level="requireAdministrator" uiAccess="false" />
Now I get an UAC when calling the application and everything works fine.
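For context, that fragment sits inside the requestedPrivileges element of the application manifest; a typical complete manifest looks like this:
<?xml version="1.0" encoding="utf-8"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v2">
    <security>
      <requestedPrivileges>
        <requestedExecutionLevel level="requireAdministrator" uiAccess="false" />
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>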
|
With our product we have a simple backup tool for the sql server database. This tool should just make a full backup and restore to and from any folder.
Of course, the user (usually an administrator) needs permission to write to the target folder.
To avoid the problem of not being able to perform a backup to a network drive, I write the backup to a temp file in the Sql Server backup directory. Then I move it to the target folder. This requires permission to delete the temporary file from the sql servers backup folder. Restore is the same in the other direction.
This seemed to work fine until someone tested it on vista, where the user does not have write access to the backup folder by default.
So there are many solutions to solve this, but none of them seemed to be really nice.
One solution would be to find another folder for the temporary file. Both the sql server user as well as the administrator performing the backup need read and write permissions. Is there such a directory?
Any other ideas? Thanks a lot.
Edit: Solution must work with Sql Server 2005 and 2008, C# 3.0 (Smo), Windows XP and Vista.
|
Sql Server Backup and move backup file: How to cope with file permissions?
|
I use http://s3tools.org/s3tools and the tool S3cmd in a cron job to upload all the stuff I want to back up. I know of people that use SVN and then just back that up.
|
|
I've done a fair bit of research on this via Google, and there seem to be quite a few ways of possibly doing this.
I'm looking to incrementally back up new and updated files in two directories on my Plesk-run CentOS 5.2 server: /backups and /var/www/vhosts (preferably only the httpdocs within each vhost).
Has anyone got some great feedback from using the various solutions? There seem to be various Java, Perl and Ruby based solutions out there.
Many thanks,
Chris
|
Automated incremental backups from Plesk on Centos to Amazon S3
|
I don't know Bluehost, but definitely put it outside the web root if you can. If you can't, create a new directory and put the shell script in it along with a .htaccess file saying
Deny from all
That should block access from outside with a 403 error - make sure you test it.
Putting it outside the web root is the far better option IMO, especially if it contains sensitive data.
|
After some serious database-loss problems, I decided to go for an auto-backup system. I did some research, and AutoMySQLBackup looks fine.
I use Bluehost - where should I locate the automysqlbackup.sh file? I'm just worried about security issues. Also, where should we locate the backup files?
I appreciate it - thanks a lot!
|
Where to locate automysqlbackup.sh [AutoMySQLBackup] at Bluehost? [security issue]
|
Just store the hashes of your chunks somewhere. You don't need the "backup" copy to compare against if you know what your hashes are. Obviously this creates a chicken-and-egg problem for at least one hash, but copying a single "chunk" is a much smaller problem.
Your proposed approach will still have performance problems, though, as hashing a large file isn't going to be a pretty operation on a slow CPU powered by a battery.
I assume you don't have the granular control to keep track of the parts of the file you modify, and then update just those sections when you need to do a backup?
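A rough C# sketch of the chunk-hash idea (chunk size, paths and the hash store are assumptions, not from the original system):
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography;
static class ChunkBackup
{
    const int ChunkSize = 64 * 1024; // assumption: 64 KB fixed-size chunks
    // Hash every fixed-size chunk of a file; compare the result to the previously
    // stored list and rewrite only the chunks whose hash changed.
    public static List<byte[]> HashChunks(string path)
    {
        var hashes = new List<byte[]>();
        using (var md5 = MD5.Create())
        using (var fs = File.OpenRead(path))
        {
            var buf = new byte[ChunkSize];
            int read;
            while ((read = fs.Read(buf, 0, buf.Length)) > 0)
                hashes.Add(md5.ComputeHash(buf, 0, read));
        }
        return hashes;
    }
}
As the comments below note, this only helps while changes stay at the same offsets; an insertion near the front of the file shifts every later chunk and defeats the comparison.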
Nope.. This is perhaps the least organized database that I've ever seen (where it wasn't done on purpose). Primary keys are strings (ugh) and very little is normalized. There is no datetime stamp on anything. It is just ugly. The problem with the hash idea (and why I've come here) is what you've said: if the file changes just a bit at the front and moves all the data by an offset of just one, then the hashes will not match up and all performance gains are gone.
– baash05
Dec 10, 2009 at 23:23
If you've got reasonable control over the system (I know nothing about winmo development) I'd say your best bet might be to take a look at third party tools that do this sort of differential backup. It's a hard problem to solve from scratch. You might look at DAR for inspiration. dar.linux.free.fr
– Paul McMillan
Dec 10, 2009 at 23:45
I think tar and rsync also implement some form of differential/incremental file copying/backup.
– Paul McMillan
Dec 10, 2009 at 23:47
|
On a Windows Mobile unit, the software I'm working on relies on an SDF file as its database.
The platform that the software is targeted towards is "less than optimal" and hard resets every once in a while. In the far distant past we lost data. Now we close the database and copy the SDF file to the SD card. If the unit gets hard reset, we restore the app (also on the SD card) and the database.
I'm not concerned about the restore (just yet). The problem we have now is that doing a "backup" takes a crazy amount of time because the SDF is 7+ megs and writing to the SD card is slow slow slow.
My boss suggested we create hashes of "chunks" of the file and then write to the destination file only when a compare of the hashes is !=.
So here's the question.
How would you test whether a file has changed if you can only have one copy of the file and thus can't compare it with its original?
I'm just shooting for a bit of brain storming.
|
Backing up my database is taking too long
|
Sub Main()
    ' Late-bound Access Manager automation objects
    Dim objAuthApp As Object
    Dim objAuthDoc As Object
    Dim objLAEConfig As Object
    Dim objDSConfig As Object
    Dim laef
    ' Build a date-stamped LAE file name on the backup share
    laef = "\\backupsserver\backups\LAEBackup-" & Year(Now) & "-" & Month(Now) & "-" & Day(Now) & ".lae"
    ' Start Access Manager and register the LAE file as the export target
    objAuthApp = CreateObject("Authenticator2.Application")
    objLAEConfig = objAuthApp.LAEConfigurations.Add(laef)
    ' Point at the directory server (host, port, base DN) and make it the default source
    objDSConfig = objAuthApp.DSConfigurations.Add("servername", 389, "o=cognos,dc=com", 0)
    objDSConfig.DefaultSecuritySource = True
    ' Export the "Default" namespace using the admin credentials, then clean up
    objAuthApp.Export("Default", "Administrator", "password", True, False)
    objAuthApp.Quit()
    objDSConfig = Nothing
    objLAEConfig = Nothing
    objAuthDoc = Nothing
    objAuthApp = Nothing
End Sub
|
Within Cognos 7.4 security.. one would create an LAE file to export all their users...
directions here... http://www.cognos-install.co.uk/articles/backups/access_manager_export_to_lae.asp
Now you'll notice at the bottom "Finally, it is possible to build an automated process for this task, however this is outside of the scope of this document. If you feel that an automated process is something that is important to your organization then this can be achieved using Technical Consultants with Cognos Planning expertise."
Does anyone have a batch script or command line for Access Manager for exporting an LAE file? I would like to have automated backups of my users in case of a disaster. This cannot be the first request for this.
TIA,
Kirby
|
Cognos LAE Backup Automation (Batch File?)
|
There are many solutions depending on what you mean by periodically and the use of the SQL Server 2008 and SQL Server 2005 databases. Assuming that the SQL Express 2005 client database is an application that controls its own data but eventually needs to merge those changes into the SQL Server 2008 database, then Merge Replication probably would be a good solution. The SQL Server 2008 database would be the Publisher and the SQL Express 2005 would be the subscriber (SQL Express can only be a subscriber).
|
I am developing an application which needs to back up data between SQL Express 2005 and SQL Server 2008. My client runs an installation of SQL Express 2005 and needs to periodically back up data to a server database running on SQL Server 2008. The client db also receives some new data from the server and needs to update itself. The question is: how do I do it? Please help.
|
Backing up data between SQL Express 2005 and SQL Server 2008 Standard Edition
|
0
You can do this with FogBugz on Demand. Here is a scrape right off the "Your account" screen:
Download Your Database:
You can download your FogBugz database in
three different formats. The FogBugz
database format (schema) is documented
here.
SQL Server 2000
Access – This format is temporarily unavailable
MySQL
|
I'd like to use a hosted issue-tracking service, but I want to be able
to backup my data in case the service goes under.
Are there any hosted issue-tracking services that have a "data
liberation" strategy and support you periodically copying your
issue-tracking data to your local system?
If not, does anyone have a clever workaround for backing up your
data using an existing hosted issue-tracking service, even if it's in
an unsupported way?
|
How to backup your data with hosted issue tracking?
|
0
Can't you have the ASP.NET page read the file from the backup location and stream the file to you?
Of course, you might have to talk to your host about this. They could be explicitly blocking your web site from accessing the SQL Server folders. Why won't they just provide a backup for you? The hosts I've used for shared SQL Server hosting have no problems doing this.
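A rough sketch of that approach in the page's code-behind (the path is hypothetical, and the application pool account needs read access to it):
protected void btnDownload_Click(object sender, EventArgs e)
{
    // assumption: the host lets SQL Server write here and lets IIS read it
    string path = @"D:\SqlBackups\mydb.bak";
    Response.Clear();
    Response.ContentType = "application/octet-stream";
    Response.AppendHeader("Content-Disposition", "attachment; filename=mydb.bak");
    Response.TransmitFile(path);   // streams without loading the file into memory
    Response.End();
}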
I don't know if IIS will have the permissions to access SQL Server folders but I'll try. Well, it's not for me; my employer needs a web interface to do it instead of the hosting management UI.
– user179200
Oct 18, 2009 at 16:27
Well like I said, you might have to talk to the web host to make that happen.
– Aaron Bertrand
Oct 18, 2009 at 21:11
|
I'm trying to create an asp .net page to backup and download an mssql db from my remote host on a shared server.
I tried using SMO to create a backup and save it in the same folder as the .aspx page, but it throws an access denied exception. I later found out that the SQL Server doesn't have permission to write backups anywhere but its local backup folder, which I can't download from.
Is there any alternative method to get this working?
|
asp .net page to backup sql db
|
0
In case you want to do something more space-efficient than cp --archive, you might want to look at faubackup.
rsync is much better for this as it first checksums the files, so it's more resource-effective.
– Zsolt Botykai
Oct 16, 2009 at 10:09
|
I need to back up (for debugging) some temporary files while a program is running. I used to do it by rsync-ing the /var/tmp/someprogram directory with find . -iname 'blahblah' -exec rsync -someoptions $DESTdir, which works wonderfully.
Except in the case where some program (for which I don't have source code access, and never will have - sad, and a long story starting with COBOL...) overwrites the temporary file with new content from within the program.
E.g.
Starting PROG program
This creates a B1237 file in the tmp dir (and I'm rsyncing every few minutes)
After PROG finishes processing B123 it starts another cycle and recreates B1237 with new content, and the rsync-ed file will be overwritten. And I need the first version as well.
Is this possible in an easy way? I was thinking about timestamped gzipping of the DEST dir, but there should be another way (without creating a (e.g.) git repo in the dest dir and committing after syncing...)
Update:
I did not mention - sorry - that I don't have the rights to install anything on this SLES9 (corporate) server.
Thanks in advance!
|
Rsync create new file on file size change
|
You can't restore the tablespace into an existing database; however, you can restore only the system catalog tablespace (SYSCATSPACE), a temporary tablespace (like TEMPSPACE1) and the tablespace you want into a new database.
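A hedged sketch of what that could look like from the DB2 command line (database names, path and timestamp are placeholders; check the exact options against the 8.2 documentation):
db2 restore database PRODDB tablespace (SYSCATSPACE, TEMPSPACE1, MYTBSPC) from /backups taken at 20090101120000 into NEWDB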
|
In DB2 8.2 for LUW can you restore one tablespace to a 2nd server from a full database backup without having to restore the entire database? The database backup was taken when the system was up so log files are involved. I can provide further clarification if needed.
Thanks,
Dave
|
DB2 - LUW 8.2 Can you restore one tablespace from a full DB backup to a different server
|
How to Script User and Role Object Permissions in SQL Server
Might also be of interest:
How to transfer logins and passwords between instances of SQL Server
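As a rough complement, object-level grants can also be scripted directly from the catalog views (a simplified sketch; it ignores DENY, column-level permissions and schema scoping):
SELECT 'GRANT ' + dp.permission_name
     + ' ON ' + QUOTENAME(OBJECT_NAME(dp.major_id))
     + ' TO ' + QUOTENAME(USER_NAME(dp.grantee_principal_id)) + ';'
FROM sys.database_permissions AS dp
WHERE dp.class = 1       -- object-level permissions
  AND dp.state = 'G';    -- plain GRANT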
|
I would like to know if there is a query to export the user rights (SELECT, INSERT, GRANT and so on) to a SQL file.
I managed to get the list of the grants:
Select * from INFORMATION_SCHEMA.TABLE_PRIVILEGES
|
Export user rights into a SQL file for SQL Server
|
0
I've used the Microsoft Synchronization Framework to create my own Backup & Restore application to migrate user profiles and other related files. The Sync Framework at least has the option to fail silently and continue without prompting when a file is in use, which is the behavior I wanted.
Since it is a synchronization framework it has some ability related to Conflict management. I didn't need them, but it also has other 'providers' for Database and Web synchronization, in addition to the File Sync.
Maybe running the program remotely using psexec can reduce the number of files in use. In my case I only cared about the "current" user so the program was run by the local user and backed up files to a Network Share.
The Sync Framework did require an MSI installation, but I did that with PSExec too.
|
--BEGIN RANT--
ALL I want to do is copy the "Documents and Settings" folder to back it up in Windows Server 2003, so we can grab old files from it, as needed, easily, after I wipe the server to upgrade the OS to Windows Server 2008.
BUT, I get errors about NTUSER being in use, etc., when I try to copy it.
It's VERY irritating that an administrator can't just use Windows Explorer to copy folders, but whatever.
I don't have time to jump through security hoops, wiping out permissions.
I don't have time to initiate a HUGE folder copy only to have it fail some random % through the operation when it runs into an open file handle.
I don't have time to repeat this process over and over, hunting down the open file handle and closing it with process explorer each time, destabilizing the system as a side effect.
--END RANT--
I understand that a user-mode process can have some kind of flag that allows it to bypass NTFS security.
How do I give this "backup" role (SE_BACKUP_NAME/SE_RESTORE_NAME) to a process, such as a C# program?
Can it be done in managed code (or am I gonna have to dig up Win32 API docs, or hand-write manifests and embed them who-knows-where, or track down some obscure backup-program-registration command-line-utility)?
How do I open a file in C# with backup semantics, so I can copy the file?
Will open file handles (with exclusive read-locks) interfere with reading the file for backup purposes, or will opening it with backup semantics take care of that, and temporarily suspend writing to the file while it's opened for backup?
I'm asking here, because it's useful information to remain on-record at stackoverflow, that someone probably can spout off the top of their head, that I don't have time to look up myself. Thank you.
Soon enough, I'll be upgrading some servers to the nightmare that is Windows Server 2008 and learning the icacls command utility, thanks to Microsoft removing GUI support for editing security on multiple folders/files (smart move. not.)
|
Writing a file backup utility in C#
|
To copy to another SQL Server? If so, what you are after is transaction log shipping.
The transaction log backup captures the log at a certain point in time, however the log is still required to give the complete picture of the data in the database so it can't be moved. If you want to move it, stop any applications that are accessing the database's data, take the database offline, detach it, move the files and re-attach it.
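For reference, the hourly job itself is a single statement (database name and path are placeholders):
BACKUP LOG MyDb TO DISK = N'D:\Backups\MyDb_log.trn' WITH INIT;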
|
Is it possible to copy the physical transaction log file for SQL 2005?
Currently our databases are backed up every night, with hourly transaction log backups taken during the day. Is it possible that after the hourly backup has been done, the physical log file can be copied to another server? What steps would need to be taken to get this to work?
Thanks
Andrew
|
Backup up Transaction logs
|
0
Take a look at DeltaCopy - rsync for Windows. It's free, and does a great job.
|
I'm from a Linux background, and need to run an incremental backup script on Windows. I already have a batch script which dumps my database into a file. What I'd like is to keep backups only for the last seven days, in addition to one backup file per week for the last 4 weeks (for example). I presume it's possible to do something like this with the Windows Task Scheduler and a clever batch script? (See the sketch below.)
Thanks.
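A rough sketch of the rotation part (paths are placeholders; forfiles ships with Windows Server 2003 and later, and %date% parsing is locale-dependent):
@echo off
rem keep a dated daily copy, then delete dailies older than 7 days
copy C:\dumps\db.sql "D:\backups\daily\db_%date:~-4%%date:~4,2%%date:~7,2%.sql"
forfiles /p D:\backups\daily /m *.sql /d -7 /c "cmd /c del @path"
rem a second scheduled task, run only on Mondays, can copy into D:\backups\weekly the same way
Schedule it with the Windows Task Scheduler to run once per day.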
|
Windows incremental backup script
|
0
I think an open Writer should be okay, but you definitely can't copy while another thread is modifying the index, or you may get a FileNotFoundException. From the source:
/**
* Copy contents of a directory src to a directory dest.
* If a file in src already exists in dest then the
* one in dest will be blindly overwritten.
*
* <p><b>NOTE:</b> the source directory cannot change
* while this method is running. Otherwise the results
* are undefined and you could easily hit a
* FileNotFoundException.
*
* @param src source directory
* @param dest destination directory
* @param closeDirSrc if <code>true</code>, call {@link #close()} method on source directory
* @throws IOException
*/
public static void copy(Directory src, Directory dest, boolean closeDirSrc) throws IOException {
|
I am thinking of adding JMX bean for taking hot backup of lucene index.
LuceneMBean mbean = new LuceneMBeanImpl();
ObjectName name = new ObjectName("indexing.index:type=lucene");
MBeanServer mbs = ManagementFactory.getPlatformMBeanServer();
mbs.registerMBean(mbean, name);
LuceneMBean will have a method called backupIndex(String directory).
I have gone through the Lucene docs and found the copy() method of Directory. If I have a Writer open on the directory, will this method work? Basically my code snippet is as follows:
public class LuceneMBeanImpl implements LuceneMBean{
public void backupIndex(String directory){
Directory fileDirectory = FSDirectory.getDirectory(directory);
Directory.copy(masterDirectory, fileDirectory,false);
}
}
|
How to take an online hot backup of a Lucene FSDirectory index?
|
Ntbackup, the backup program included in previous versions of Windows Server and NT-based Windows client operating systems, is no longer supplied or supported, but you still have some choices.
Use Windows Server Backup to back up full hard drive partitions. You will need to configure your system to store data on a separate partition from the boot partition in order to keep from backing up the operating system. (Note: You must back up to a partition on a different disk or to a network share.)
Use Robocopy. This is a command-line tool that replaces xcopy. It has numerous options that allow you to back up via file copying and set the archive bit according to what type of backup job you want to achieve (see the sketch after this list).
Purchase a third-party backup tool. Microsoft Data Protection Manager and Symantec BackupExec are two enterprise-capable (and enterprise-priced) options.
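A hedged robocopy sketch along the lines of option 2 (source, destination and log path are placeholders):
robocopy D:\Data \\backupserver\share\Data /E /M /COPY:DAT /R:2 /W:5 /LOG:C:\Logs\backup.log
Here /E copies subdirectories, and /M copies only files with the archive attribute set and then resets it, giving incremental behaviour.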
|
What are my options if I want to just backup my data directories on Windows 2008 Server. It looks to me like the backup software built into the OS only backs up an entire hard drive.
Is my only option to copy the files?
|
Windows 2008 Server Backup
|
0
I am facing the same issue too. I'm working with SAP HANA running on a SUSE 15 SP1 Azure VM, with network connectivity established by the use of NSG service tags, as described in https://learn.microsoft.com/en-us/azure/backup/backup-azure-sap-hana-database.
Looking inside the script I noticed that a curl request is used to run connectivity checks
curl --silent --output /dev/null --location --request "HEAD" "https://login.windows.net/" --write-out "%{http_code}\n"
If I run curl manually, changing the options in order to make it display the whole of the command output, it returns the following:
Warning: Setting custom HTTP method to HEAD with -X/--request may not work the
Warning: way you want. Consider using -I/--head instead.
curl: (56) OpenSSL SSL_read: SSL_ERROR_SYSCALL, errno 104
This error is reported as a closed issue on the curl GitHub repo (https://github.com/curl/curl/issues/4409). It looks like it was fixed by a subsequent curl release.
Mine is 7.60.0. The latest is 7.73.0. Unfortunately 7.60 seems to be the latest available in the SUSE repos, and I do not want to install unapproved packages on a production system.
So in my case the only way out would be to run the script with the --skip-network-checks parameter.
|
I am using SAP HANA 2.0 and trying to implement SAP HANA backup using the Azure Recovery Services vault.
For that, I have to run the script "msawb-plugin-config-com-sap-hana.sh", but it shows the error "Failed to connect to 'AAD1' service".
Does anyone know the cause? Please help.
|
Failed to connect to 'AAD1' service in SAP HANA Backup using Azure site recovery
|
-1
To see how much you currently have:
1. Open the Settings app, select "iCloud", and find where it says "Storage" to view how much space you have left in your iCloud account.
2. You can also view your available iCloud storage space by navigating to "General" > "Storage & iCloud Usage".
I think this does not answer the question; he is probably asking how to do it in code, not how to navigate iOS.
– peetadelic
Jul 6, 2023 at 6:27
|
I want to upload my app data to iCloud but my iCloud storage has no space.
How can I check for that, so I can show an alert before starting the upload?
|
How to check iCloud free space
|
-1
Check connectivity over pbx.
You can do that with bptestconn -v {hostname}.
Ensure pbx is running. Verify with bpps -x.
Please elaborate a little bit more on your answer to help future users.
– Buddy Bob
Jun 2, 2021 at 0:54
|
We ran into a problem; we use Symantec NetBackup 8.1.1 for backups.
The task was to reinstall the master server and restore all data. I created a catalog policy and everything went off without a hitch: the system was wiped and reinstalled, and the catalog was restored successfully. But the media server and all clients cannot connect to the new master; the error is "The vnetd proxy encountered an error".
What could be the problem?
screenshots:
ibb.co/GCNgd2q
ibb.co/V9Gp3QL
ibb.co/dLZhxHZ
|
Symantec netbackup 8.1
|
Use the --result-file option of mysqldump to create the dump file which you may then send to the browser for download. The full command would be (insert your file names as desired):
exec('mysqldump --user=backup --password="yourpasswordhere#" --routines --insert-ignore --complete-insert --force --result-file=name_of_your_dump_file_here.sql --databases ' . $yourdatabasename);
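A rough sketch tying that to the download part of the question (credentials and names are placeholders; error handling kept minimal):
<?php
$archivo = 'backup_' . date('Y-m-d_His') . '.sql';
$comando = 'mysqldump --add-drop-table'
         . ' --host=' . escapeshellarg($servidor)
         . ' --user=' . escapeshellarg($usuario)
         . ' --password=' . escapeshellarg($clave)
         . ' --result-file=' . escapeshellarg($archivo)
         . ' ' . escapeshellarg($base);
exec($comando, $salida, $estado);   // $estado is mysqldump's exit code
if ($estado === 0) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename=' . basename($archivo));
    header('Content-Length: ' . filesize($archivo));
    readfile($archivo);
    unlink($archivo);
} else {
    echo 'mysqldump failed with exit code ' . $estado;
}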
|
I am coding a PHP script to back up a database. The idea is to use the mysqldump command and then download a SQL file with the result. Unfortunately, the output of mysqldump is not saved in the file; the file just contains the command text. This is the code:
$archivo = 'bkbiblioteca_' . date("d-m-Y_H:i:s") . '.sql';
$comando = "mysqldump --add-drop-table --host=$servidor --user=$usuario --password=$clave $base > $archivo";
try{
$archivo_manejador = fopen($archivo, 'w+');
fwrite($archivo_manejador, $comando);
fclose($archivo_manejador);
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename=' . basename($archivo));
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($archivo));
ob_clean();
flush();
readfile($archivo);
unlink($archivo);
}catch(\Exception $e){
echo 'error en el proceso de bk' . $e->getMessage(); //TODO: favor incluir en la bitacora global del sistema
}//fin de catch
Right now I am getting the file with this line:
mysqldump --add-drop-table --host=$servidor --user=$usuario --password=$clave $base > $archivo
I have tried with this command:
system("$comando");
But it does not work, so I can't figure out how to execute the mysqldump command and get the result into the file instead of the command text.
Thanks in advance
|
mysqldump command is not executed just it is saved on a file
|
-1
If you are deleting or inserting, simply add another line of code to perform the task. Here is an example:
<?php
include_once '../dbconnect.php';
$error = false;
if ( isset($_POST['btn-signup']) ) {
// clean user inputs to prevent sql injections
$emails = trim($_POST['emails']);
$emails = strip_tags($emails);
$emails = htmlspecialchars($emails);
//basic emails validation
if ( !filter_var($emails,FILTER_VALIDATE_EMAIL) ) {
$error = true;
$emailsError = "Please enter valid emails address.";
} else {
// check emails exist or not
$query = "SELECT emails FROM newsletter WHERE emails='$emails'";
$result = mysql_query($query);
$count = mysql_num_rows($result);
if($count!=0){
$error = true;
$emailsError = "Provided emails is already in use.";
}
}
// if there's no error, continue to signup
if( !$error ) {
$query = "INSERT INTO newsletter(emails) VALUES('$emails')";
$res = mysql_query($query);
if ($res) {
$errTyp = "success";
$errMSG = "Successfully registered, you may login now";
unset($emails);
} else {
$errTyp = "danger";
$errMSG = "Something went wrong, try again later...";
}
}
}
?>
There are dozens of tables in my database. Many tables have foreign keys with the 'cascade' property on update/delete, so if I delete a row from a table with a primary key column, it will delete many rows from the tables whose foreign keys reference that primary key.
– sumit1024
Sep 21, 2016 at 12:18
|
I want to create a trigger that will insert deleted data into a backup table. I have written this code, but it's not working: MySQL throws a syntax error when I create the trigger. How can I get the expected result? Any help would be appreciated.
BEGIN
IF EXISTS (SELECT *
FROM information_schema.tables
WHERE table_schema = 'jobportal'
AND table_name = 'dlt_jobs'
LIMIT 1) THEN
create table dlt_jobs (select *,now() as deleted_on from jobs where job_id=OLD.job_id) ;
ELSE
insert into dlt_jobs (username) values ('something');
END IF;
END
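For what it's worth, DDL such as CREATE TABLE is not allowed inside a MySQL trigger body, which is one reason this fails. A hedged sketch of the usual approach: create dlt_jobs once up front, then let a BEFORE DELETE trigger copy the row (the column list is an assumption and must match your jobs table):
-- one-time setup, outside the trigger
CREATE TABLE dlt_jobs LIKE jobs;
ALTER TABLE dlt_jobs ADD COLUMN deleted_on DATETIME;
DELIMITER $$
CREATE TRIGGER jobs_backup_before_delete
BEFORE DELETE ON jobs
FOR EACH ROW
BEGIN
  -- hypothetical columns; list every column of jobs here
  INSERT INTO dlt_jobs (job_id, username, deleted_on)
  VALUES (OLD.job_id, OLD.username, NOW());
END$$
DELIMITER ;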
|
Trigger that will insert deleted data into table
|
0
It's not possible. There's nothing similar to Glacier Vault Lock. You can temporarily suspend a user, though.
You can temporarily block a user's access to your organization's
Google services by suspending their account. This doesn't delete their
email, documents, calendars, or other data. And their shared documents
remain accessible to collaborators. But the user can no longer sign in
to their account. Also, new email and calendar invitations are
blocked. After suspending a user, you can restore the account at any
time.
https://support.google.com/a/answer/33312
Thanks, but I'm looking at DELETE permissions; I need to ensure that no one can delete objects (even the administrator or someone with admin permissions).
– Hello World
Aug 25, 2016 at 17:47
@HelloWorld, my bad, I will update the answer. It's not possible, since there's nothing in Google similar to Glacier Vault Lock.
– Anton Zorin
Aug 26, 2016 at 11:52
|
I bought the CloudBerry Ultimate software (link for more information) to make backups. With that software I can control when it deletes the objects I backed up, but I want to be sure that it will be impossible to delete files from my Google Storage Nearline. I know that Amazon Web Services has Amazon Glacier with Vault Lock, which prevents deleting objects for a period of time and makes it impossible (even with all administrative privileges) to delete any object or modify the parameters.
Does anyone know how I can prevent deletion of any object in my Google Nearline account?
|
Prevent objects deletion from Google Storage
|
1
You can back up your WordPress site either from your hosting account (preferable) or from your WP dashboard.
You need to back up two things: all the files (the root of your WordPress installation) and the database for your WP installation.
Since you only have access to the dashboard, you have to use plugin for this.
Two of my favorite free backup plugins are:
BackupWordpress - https://wordpress.org/plugins/backupwordpress/
BackWPup - https://wordpress.org/plugins/backwpup/
They are intuitive and easy to work with, so you shouldn't have issues.
|
I have never used WordPress before. My boss has given me access to a site which was created using WordPress, then he asked me how I am going to make sure I don't break the site accidentally. I told him I would create a backup on my local computer so that all my changes can be restored if I mess up.
I have the WordPress dashboard up. How do I back up EVERYTHING? I hear there are two separate things I need to back up. Someone please help me.
PS: I don't think he would like me to do this without the use of additional plugins.
|
How to manually back up wordpress website
|
It's available now (in preview mode).
It's possible to back up Azure V2 VMs, i.e. back up a VM in the new Azure Resource Manager portal.
Check this link:
https://azure.microsoft.com/en-us/documentation/articles/backup-azure-vms-first-look-arm/
|
Backup of virtual Machine in the Azure new portal.
Backup of Azure V2 VMs
|
Backing up a Azure virtual machine (V2) in Portal through Resource Group
|
0
There are two different backup systems built into DA:
Admin Tools | System Backup. This tool lets you backup configuration data and arbitrary directories, locally or using FTP or SCP.
Admin Tools | Admin Backup/Transfer. This tool is oriented toward backing up data account by account, in one archive per account, in a format that you can use to restore from (in the same tool) on the original or another DA server (i.e. if you want to transfer to a new server). You can back up locally and/or via FTP.
Both options can also be scheduled via cron.
Depending on your level of access, only one of these might be available to you. This page has further info for non-administrators: http://www.site-helper.com/backup.html.
|
I'm sure there's a good number of developers here who use DirectAdmin, and I had a quick question.
I've always used cPanel, and I'm now on a server that is using DirectAdmin instead. Where in DirectAdmin can you generate a full backup of the account at the user level?
Also, do DirectAdmin backups include everything related to the account like cPanel backups do? For example, not only the files and databases but also the cron jobs, DNS zones, email accounts, etc.?
And where are the backups stored by default? Is there an option to send the backups to a remote server via FTP like you can with cPanel?
|
How do backups work in DirectAdmin?
|
You will need to create a backup of your site and then restore it.
For Backup
Backup-SPSite -Identity http://Constoso -Path "c:\backup\YourSiteBackup.bak"
For Restore
Restore-SPSite -Identity http://DestinationServer -Path "c:\backup\YourSiteBackup.bak"
But you will need to be careful while restoring the web parts.
This may be of some help:
|
I want to clone the production environment to development. Will only a farm backup and restore back up the service applications and custom web parts?
|
Can we backup and restore custom webparts and service application to development by taking farm backup?
|
0
According to the MySQL 5.6 documentation I looked up, the user and password parameters are both of the form --word=value, and you only have them as -lvalue (where l is a letter). Yours might be an allowed substitute, but it seems worth trying it the way the documentation gives it: "--user=whoever --password=password".
Since you're on windows, it seems reasonable to format the string for the command on the F drive with windows directory separators instead of unix/mac ones. So I would try using "f:\\backup", etc., instead of what you have there.
As a debugging tool, you might try writing the command out to a file, then copying and pasting it into a DOS window and running it. Don't cheat yourself -- don't type in ANYTHING, take it all from the program. At least that way you can look at the output.
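A hedged sketch of running the dump with ProcessBuilder so mysqldump's own error output becomes visible (paths, credentials and database name are placeholders):
import java.io.BufferedReader;
import java.io.InputStreamReader;
public class BackupDebug {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(
            "C:\\Program Files\\MySQL\\MySQL Server 5.6\\bin\\mysqldump.exe",
            "--user=root", "--password=secret",          // placeholders
            "--result-file=F:\\backup\\backup.sql", "mydb");
        pb.redirectErrorStream(true);                    // merge stderr into stdout
        Process p = pb.start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) System.out.println(line);
        }
        System.out.println("exit code: " + p.waitFor());
    }
}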
|
I keep getting 'processComplete == 2' error. I am sure someone here will spot my error in seconds.
Here is my code...
String filePath = "F:/backup";
File f1 = new File(filePath);
f1.mkdir();
String bath = "F:/backup" + "/backup.sql";
String executeCmd = "C:\\Program Files\\MySQL\\MySQL Server 5.6\\bin\\mysqldump -u" + username + " -p" + password + " --database " + host + " -r " + bath;
Process runtimeProcess = Runtime.getRuntime().exec(executeCmd);
int processComplete = runtimeProcess.waitFor();
if (processComplete == 0){
System.out.println("Backup complete");
}
else{
System.out.println("Backup failure");
}
I tried debugging in Java. So far everything up to the int processComplete = runtimeProcess.waitFor(); line is correct. However, from the 'if (processComplete == 0){' bit it jumps straight to
else{
System.out.println("Backup failure");
}
I keep getting the 'processComplete == 2' error, and I would greatly appreciate it if someone could tell me where I am going wrong. It creates the backup directory successfully, but not a backup.sql file.
|
processComplete == 2 error. mySQL Backup Java
|
-1
You need to make sure that the log on account used for the SQL Server service has access to the shared folder. On the server...
Click Start -> Run
Type services.msc
Scroll down to SQL Server
Right Click -> Properties
Click on "Log On" tab.
Verify that the log on account has access to the shared folder.
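Once the service account has rights on the share, the backup itself is plain T-SQL (server, share and database name are placeholders):
BACKUP DATABASE MyDb TO DISK = N'\\fileserver\backups\MyDb.bak' WITH INIT;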
|
I have a problem with creating SQL Server backups. I would like them to be stored on a network share rather than the local hard drive, but passing a network share path (UNC) makes them fail. Is there any way I can do it other than mapping the share to a drive letter?
|
SQL Server make backup to different machine
|
-1
Make sure the path exists on your database server, not on your client machine.
|
The code which I am using for the button:
protected void btn_backup_Click(object sender, EventArgs e)
{
try
{
Class_Backup objbackup = new Class_Backup();
objbackup.BackUpPath = "SalvageManager" + DateTime.Now.ToString().Replace("/", "_").Replace(":", "_").Replace("-","_").Replace(" ","_");
objbackup.BackupData();
if (objbackup.OperationStatus != false)
{
Response.ContentType = "application/bak";
Response.AppendHeader("Content-Disposition", "attachment; filename=" + objbackup.BackUpPath);
Response.TransmitFile(Server.MapPath("~/DataBaseBackUp/" + objbackup.BackUpPath));
Response.End();
//Response.Redirect(Server.MapPath("~/DataBaseBackUp/" + objbackup.BackUpPath),false);
}
else
{
lbl_message.Text = objbackup.ErrorMessage;
}
}
catch(Exception ex)
{
lbl_message.Text = ex.Message;
}
}
When I click the button, the following error occurs:
Error : Cannot open backup device 'D:\INETPUB\VHOSTS\salvagemanagers.com\httpdocs\DataBaseBackUp\SalvageManager8_19_2013_12_18_01_PM'. Operating system error 3(The system cannot find the path specified.). BACKUP DATABASE is terminating abnormally.
|
How to get database backup on running website on click of a button using asp.net
|
0
Why are you discarding link-dest? I use a script with that option and take snapshots pretty often and the performance is pretty good.
In case you reconsider, here's the script I use: https://github.com/alvaroreig/varios/blob/master/incremental_backup.sh
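For reference, a minimal sketch of the --link-dest pattern that script builds on (paths are placeholders):
SNAP=/backups/$(date +%Y-%m-%d_%H%M)
rsync -a --delete --link-dest=/backups/latest /data/ "$SNAP"/
ln -sfn "$SNAP" /backups/latest   # unchanged files become hard links, costing almost no extra space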
Oh I'm not saying it's a bad option, but if that's all I'm doing then cp -lR should have less overhead than rsync, I was just wondering if there were any faster methods as for a very large structure even cp -lR can be extremely slow. The main advantage of rsync is that you can do other stuff at the same time with a single command, such as copying in updated files in addition to linking the rest.
– Haravikk
Jul 28, 2013 at 10:28
|
Okay, so I need to create a copy of a file-structure, however the structure is huge (millions of files) and I'm looking for the fastest way to copy it.
I'm currently using cp -lR "$original" "$copy" however even this is extremely slow (takes several hours).
I'm wondering if there are any faster methods I can use? I know of rsync --link-dest but this isn't any quicker, but really I'd want it to be quicker as I want to create these snap-shots every hour or-so.
The alternative is copying only changes (which I can find quickly) into each folder then "flattening" them when I need to free up space (rsync newer folders into older ones until the last complete snapshot is reached), but I would really rather that each folder be its own complete snapshot.
|
Faster Alternatives to cp -l for a Whole File Structure?
|
There are actually more parameters to metadata_filter (a nice example of a doc bug):
SQL> desc dbms_datapump.metadata_filter
Parameter Type Mode Default?
----------- -------- ---- --------
HANDLE NUMBER IN
NAME VARCHAR2 IN
VALUE VARCHAR2 IN
OBJECT_PATH VARCHAR2 IN Y
OBJECT_TYPE VARCHAR2 IN Y
HANDLE NUMBER IN
NAME VARCHAR2 IN
VALUE CLOB IN
OBJECT_PATH VARCHAR2 IN Y
OBJECT_TYPE VARCHAR2 IN Y
...and I believe you'll have to qualify the object type you're filtering for:
object_type => 'TABLE'
update after you provided the function source:
Remove two apostrophes from each side of the filter values.
Double apostrophes are required by the compiler only. The value of the filter parameter must contain single apostrophes, but your function creates them in pairs.
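A hedged one-line alternative that produces single apostrophes directly (assuming the input list has no spaces and matches the stored case of the table names):
v_filter_value := 'IN(''' || REPLACE(v_tables_list, ',', ''',''') || ''')';
For the input 'REPORT_PERIOD,OBJECT_AVAILABILITY' this yields IN('REPORT_PERIOD','OBJECT_AVAILABILITY').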
|
My situation is quite strange, and I don't have any idea how to handle it. Scenario:
In the variable v_tables_param I have the following string (the names of the tables that I want to export): 'IN(''REPORT_PERIOD'',''OBJECT_AVAILABILITY'')'.
When I try to specify the following metadata filter that I need in order to export the tables:
DBMS_DATAPUMP.METADATA_FILTER(handle => n_h1, name =>'NAME_EXPR',value =>v_tables_param);
I get an ORA-39001: invalid argument value.
However, if I hard-code the exact value of v_tables_param into the metadata filter, it works like a charm:
DBMS_DATAPUMP.METADATA_FILTER(handle => n_h1, name =>'NAME_EXPR',value =>'IN(''REPORT_PERIOD'',''OBJECT_AVAILABILITY'')');
Any idea what is happening here?
Are there some weird scenarios in Oracle where a hard-coded string is different from a variable that has the same value?
EDIT: I added the function that computes the value of v_tables_param
FUNCTION SPLIT_TABLES(
v_tables_list VARCHAR2 --this is a string that looks like "table1,table2,table3"
) RETURN VARCHAR2
IS
n_idx PLS_INTEGER;
n_i PLS_INTEGER := 0;
v_tables VARCHAR2(2000) := v_tables_list;
v_filter_value VARCHAR(2000);
v_current_table VARCHAR2(200);
BEGIN
v_filter_value := '''IN(';
LOOP
n_idx := instr(v_tables,',');
IF n_idx > 0 THEN
v_current_table := (substr(v_tables,1,n_idx-1));
v_filter_value := v_filter_value || '''''' || v_current_table || ''''',';
v_tables := substr(v_tables,n_idx+1);
n_i := n_i + 1;
ELSE
v_current_table := v_tables;
v_filter_value := v_filter_value || '''''' || v_current_table || ''''')''';
EXIT;
END IF;
END LOOP;
RETURN v_filter_value;
END SPLIT_TABLES;
|
Oracle datapump exported tables
|
0
For each file that is stored to the file system, you have to add the "do not back up" attribute. See this answer: Adding the "Do Not Backup" attribute to a folder hierarchy in iOS 5.0.1. To check the system version you can call [[UIDevice currentDevice] systemVersion], or use macros like __IPHONE_5_0 with #if defined() and #ifndef. Good luck!
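For iOS 5.1 and later the documented API is the NSURLIsExcludedFromBackupKey resource value; checking whether the weak-linked constant is present also covers the version-detection problem. A short sketch combining it with the 5.0.1 fallback:
#include <sys/xattr.h>
- (BOOL)excludeFromBackup:(NSURL *)url {
    if (&NSURLIsExcludedFromBackupKey != NULL) {   // iOS 5.1+
        NSError *error = nil;
        return [url setResourceValue:@YES
                              forKey:NSURLIsExcludedFromBackupKey
                               error:&error];
    }
    // iOS 5.0.1: fall back to the extended attribute
    u_int8_t value = 1;
    return setxattr([[url path] fileSystemRepresentation],
                    "com.apple.MobileBackup", &value, sizeof(value), 0, 0) == 0;
}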
That's what i was doing before but now that i'm running on iOS 6 devices, the setxattr method doesn't return 0 anymore. So this method is not enough anymore.
– Mick F
Oct 19, 2012 at 9:39
Thanks for the __IPHONE_5_0 tip! (as of the systemVersion, i reckon it is the most straightforward way to get the iOS version but i don't like it as it requires some NSString parsing and some expectation of what the yet-to-come iOS versions names will be)
– Mick F
Oct 19, 2012 at 9:41
|
My app needs to download a decent amount of contents from a server to be able to be functional.
Apple guidelines about data storage mentions that this kind of data, "needed-to-work-but-easily-refetchable" shouldn't be included in iCloud/iTunes backups: fair enough.
The tricky part is that the code to prevent a directory from being backed up differs between iOS 5.0, 5.0.1 and 5.1 (cf. this technical note).
My app currently supports iOS 5.0 as a deployment target.
What should I do between the different following options:
set the deployment target to 5.1 (straightforward, but I can't find data about the proportion of users still on iOS 5.0 and 5.0.1, so I'm not comfortable raising the subject with my boss)
implementing both 5.0.1 and 5.1 codes provided by Apple but it raises some issues:
my usual way to detect whether a device is running a specific iOS version is to use respondsToSelector: with a selector introduced in the iOS version I'm targeting, but iOS 5.1 seems to introduce only constants and non-universal classes. How can I be sure I'm running iOS 5.1 or later?
What about devices running iOS 5.0? Storing data in Caches would be super annoying to deal with, both for the development team and the user experience.
Any other option to recommend?
|
How should I prevent files from being backed up to iCloud and iTunes on iOS 5.0?
|
Unfortunately, some editors do not support undo (Ctrl+Z) across saves, so in that case you will not be able to recover the data.
|
I have a script file. Unfortunately I've overwritten it with some other data. I need the old data; Ctrl+Z is not working.
How do I recover it?
|
Get overwritten file data?
|
3
Look into rsync, unless you need version control.
http://en.wikipedia.org/wiki/Rsync
Rsync only copies modifications one way. The OP probably wants changes on the iMac to be copied to the MBP, and vice versa.
– echristopherson
Aug 20, 2012 at 18:51
|
Here's my idea (which I'm currently testing as we speak):
I just got a new MacBook Pro to use on the road, but I now have the problem of keeping it in sync with my main iMac easily. I'm very familiar with git and find it much easier to use and manage than other utilities such as rsync, because I can really see what the changes I've made are. I know git was not aimed at syncing these types of files, but my idea was to init a git repo on the root directory of both computers, then create a bare repository on a network drive to act like my local GitHub. I could then push and pull to this remote from both computers, and maybe even set up tunneling so that I wouldn't have to be on the local network to sync.
Besides git not being aimed at this type of work, are there any other foreseeable problems with this? The initial commit and push will probably take forever, but I feel that after that the time won't be too bad, since git keeps good track of which files change. Just to put it in perspective, I initialized the git repos on both machines and ran a git add -A about 5 minutes ago and it's still running.
Just wondering what other people's ideas about this are and whether it's ever been tried before.
Thanks,
Bryan
|
Use git as a means to sync/backup two macs [closed]
|
The answer is in the question: install SVN (or another SCM).
Also, Eclipse at least has a local history, in case you need to go back to something you haven't committed. I don't know about other IDEs.
|
I need software which can save all the previous versions of a file, with some kind of "auto commit". I don't want to use a remote server like GitHub etc.
I am one of those noob programmers who try to do everything by trial and error. Sometimes a change in the source code screws things up even more, and I need to go back to an older version of the source. Today was even worse: my hard-worked code got filled up completely with null bytes after a power failure.
|
"subversion"-like setup on Windows 7 [closed]
|
5
Don't do this — you are re-inventing existing tools!
Invoke mysqldump instead, which is expressly designed for the purpose.
With the proper permissions, you can use PHP's system or exec to invoke this.
Indeed, the mysqldump utility is made for backup purposes (and it is the fastest way to back up a database).
– Bud Damyanov
Mar 25, 2013 at 10:51
I am aware of mysqldump; unfortunately I am a student and my teacher wants the backup performed this way.
– user2180246
Mar 25, 2013 at 10:54
@user2180246 How arbitrary.
– Lightness Races in Orbit
Mar 25, 2013 at 11:10
|
I am trying to perform PHP/MySQL backups.
I receive values from a form page, and then with a SHOW TABLES query I save those values in an array.
After that I do a "for" loop to back up each table:
<?php
$dbname = $_POST['txt_db_name'];
$tbname = $_POST['txt_tb_name'];
$ligacao=mysql_connect('localhost','root','')
or die ('Problems connecting to the MySQL server');
$res = mysql_query("SHOW TABLES FROM pessoal");
$tables = array();
mysql_select_db($dbname,$ligacao);
while($row = mysql_fetch_array($res, MYSQL_NUM)) {
$tables[] = "$row[0]";
}
$length = count($tables);
for ($i = 0; $i < $length; $i++) {
$query=
"SELECT * INTO OUTFILE 'pessoa_Out.txt'".
"FIELDS TERMINATED BY ',' ".
"ENCLOSED BY '\"'".
"LINES TERMINATED BY '#'".
"FROM $tables[$i]";
$resultado = mysql_query($query,$ligacao);
}
mysql_close();
if ($resultado)
$msg = 'Successfully exported database '.$dbname.' ';
else
$msg = 'Error: could not export database '.$tbname.' ';
?>
|
Backup an entire database with PHP/MYSQL
|
2
You can do it by GUI or by a script just like that
RESTORE DATABASE [AdventureWorksNew]
FROM DISK = N'\\nas\Backup\L40\SQL2005\AdventureWorks_backup_200702120215.bak'
WITH FILE = 1,
MOVE N'AdventureWorks_Data' TO N'C:\Data\MSSQL.1\MSSQL\Data\AdventureWorksNew_Data.mdf',
MOVE N'AdventureWorks_Log' TO N'C:\Data\MSSQL.1\MSSQL\Data\AdventureWorksNew_Log.ldf',
NOUNLOAD, STATS = 10
Read it Here
|
How can I import a .BAK file into my SQL Server 2005?
I have a database backup made in SQL Server and I want to import it into SQL Server 2005. How can I do it?
Thanks
|
import .bak to sql server
|
4
You haven't specified what type of database so I'll presume something SQL-ish and accessible over JDBC. If this is the case, DBBackup should do what you want. That said, you might want to use the database-native tool (for MySQL, that's mysqldump; for PostgreSQL, pg_dump) as it'll probably be quicker, more reliable and so on.
|
Can we take a backup of our current database through Java programming?
|
Can we take a backup of our current database through Java programming?
|
Git works differently to Subversion. In Subversion tags and branches are implemented as different paths. In Git branches and tags are not different paths, but labels/handles to a specific revision (with all its history).
In order to make a complete backup it is sufficient in Git to copy/backup the whole working tree (OR cloning into a bare repository to save space, because then there is no checkout, OR pushing your changes to a remote repository e.g. on GitLab/GitHub).
Every Git repository always (in the normal case) contains ALL history. This works, because Git is a distributed VCS and basically all repositories are equal and none is special (such as in Subversion). There is no central place (sometimes one place is defined to be the central exchange place, but that's not necessary).
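Two common ways to snapshot the whole repository, history included (paths are placeholders):
git clone --mirror /path/to/project /backups/project.git
git bundle create /backups/project.bundle --all   # single-file alternative, handy to zip and copy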
|
I'm new to Git but I have some experience with SVN.
I want to make a backup of my project onto another computer (for example mine).
In SVN there is a project with branches and tags and, of course, trunk.
example url: https://svn.example.com/project
When I check it out I have a complete copy of the project with tags, branches, etc., so I can zip it all and copy it each day as a backup to another computer.
Is there something like that in Git? I tried, but I couldn't get all of the code at the same time; I can only get each part by switching/checking out.
I'm using TortoiseGit.
If you have any other solution, please tell me.
Thanks in advance.
|
Git backup vs SVN
|
This simple script worked for me.
#!/bin/bash
# Day of week (e.g. Mon), used to pick the target folder
DW=`date +%a`
DM=`date +%d`
cd /home/
# Tar up each directory under /home directly into the backup location
for file in *;
do
tar -czvf /databackup_Allin/$DW/home/$file.tar.gz $file;
done
|
I have multiple directories under the /home/ path and I want to write a script to make a .tar for each folder and save it to another location.
After googling, what I found is:
find /home/ -type d -maxdepth 1 -mindepth 1 -exec tar cvf {}.tar {} \;
But my problem is that this creates the .tar files in /home, and I then have to move them to another location, which takes a long time.
What I want is for the tar files to be created in another location, not in the /home/ directory.
Thanks:
Akash
|
How to create a tar for each directory in a directory and save it to a different location
|
when trying to run:
zip -qr F:\Backup\20140618201605.zip C:\python
I get the:
SyntaxError: invalid syntax
This is because that's a shell command, not Python. You'll never be able to enter that in the Python shell. You should try it in a Command Prompt window.
Once you have this working manually, you can discover what errors the Python version is probably running into - it sounds like the zip command either doesn't exist (on your path) in which case you should add an entry to your path for that zip program, or it's invalid syntax that's producing an error return code.
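As an aside, the same backup can be done without an external zip tool at all, using only the standard library (a sketch reusing the question's paths):
import os, time, shutil
source = r'C:\python'
target_dir = r'F:\Backup'
base = os.path.join(target_dir, time.strftime('%Y%m%d%H%M%S'))
archive = shutil.make_archive(base, 'zip', source)  # creates base + '.zip'
print('Successful backup to', archive)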
|
I am starting to learn Python 3, reading the "A Byte of Python" ebook.
I got the above error when trying to run the following program:
# Filename: backup_ver1.py
import os
import time
source = ['C:\\python']
target_dir = 'F:\\Backup'
target = target_dir + os.sep + time.strftime('%Y%m%d%H%M%S') + '.zip'
zip_command = "zip -qr {0} {1}".format(target, ' '.join(source))
if os.system(zip_command) == 0:
print('Successful backup to', target)
else:
print('Backup FAILED')
The program prints: Backup FAILED but when trying to run:
zip -qr F:\Backup\20140618201605.zip C:\python
I get the:
SyntaxError: invalid syntax
All paths and folders DO exist; however, IDLE highlights the "F" in red!!
This is a screenshot of the error:
|
(python 3) SyntaxError: invalid syntax
|
SkyDrive has a file size limit which you are running into. The web version allows 100 MB and the desktop client supports 2 GB. So this is not possible.
Create a backup and store it on an external USB 3.0 HDD or Flash drive.
|
I purchased an HP Pavilion preinstalled with Windows 8. No external media was provided with the laptop. Is it possible to store a complete Windows 8 backup in the cloud, such that I can perform a clean Windows 8 install in the future if necessary?
|
Creating windows 8 backup on skydrive [closed]
|
The only way I think you can restore your single file is to restore the full backup onto some test server. Best practice for handling documents was not followed in your case, so that seems to be the only solution.
|
Is there a way to restore a single document library and/or site from a full backup?
I need to restore a single file actually, but find that our vendor had set us up with only a full backup and an incremental backup strategy that backs up the entire farm.
|
Restore a single document library from full backup?
|
2
Use mysqldump. Example (with the options that I usually use):
mysqldump --single-transaction --hex-blob --opt -e --quick --quote-names -r put-your-backup-filename-here put-your-database-name-here
Do I need to create a batch file and make it run every hour? I am new to mysqldump; can you elaborate more on this?
– mysqllearner
Jan 13, 2010 at 9:07
Yes, do it in a .bat file. Mysqldump is run from the commandline (e.g. "DOS" window).
– Emil Vikström
Jan 13, 2010 at 9:23
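A minimal sketch of such a .bat file (paths and database name are placeholders, and %date%/%time% parsing is locale-dependent):
@echo off
set TS=%date:~-4%%date:~4,2%%date:~7,2%_%time:~0,2%%time:~3,2%
"C:\Program Files\MySQL\MySQL Server 5.1\bin\mysqldump.exe" --single-transaction --hex-blob --opt -e --quick --quote-names -r "D:\backups\mydb_%TS%.sql" mydb
Schedule it to run hourly with the Windows Task Scheduler (or the cron mechanism already in place).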
|
I am using PHP and MySQL, and my server is Windows Server 2003 with IIS6.
I am planning to back up my database on an hourly basis. I can run the cron job; I tested it by writing the date and time into a log file, and it works perfectly.
Now I want to use the cron job to back up my databases. How do I do that? Create a PHP script and let it run every hour?
|
How to create MySQL back up executable file?
|
When creating a project, Angular creates a Git source tree and installs all of the dependencies, known as node modules.
What takes time to copy is your dependencies.
What you wish to keep is the project without the dependencies.
Conveniently, Angular also provides a .gitignore file which lists the dependencies folder as one to ignore.
So basically, all you have to do is use Git to save your project, like any developer would do.
Or you can copy the whole folder minus node_modules, like madmen do; it's really up to you.
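A minimal sketch of that workflow, run inside the project folder:
git init
git add .                      # node_modules is skipped via the generated .gitignore
git commit -m "working backup"
A fresh copy of the project can later rebuild node_modules with npm install.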
|
I'm new to Angular and I want to create a backup of a project so I have a working copy for future reference.
I tried Ctrl-C Ctrl-V on the project folder, but it takes ages and is probably a waste of disk space.
Any ideas?
I haven't applied git to the project folder yet.
|
How to backup an angular project?
|
Try adding --modify-window=1. From the rsync man page:
--modify-window
When comparing two timestamps, rsync treats the timestamps as
being equal if they differ by no more than the modify-window
value. This is normally 0 (for an exact match), but you may
find it useful to set this to a larger value in some situations.
In particular, when transferring to or from an MS Windows FAT
filesystem (which represents times with a 2-second resolution),
--modify-window=1 is useful (allowing times to differ by up to 1
second).
|
rsync - same size, content, owner, group, permissions, time - yet it wants to copy
Using macOS Mojave. Using the default filesystem in the source dir (apfs per df?), FAT32 in the dest dir (msdos per df).
Run out of ideas.
|
rsync - same size, content, owner, group, permissions, time - yet it wants to copy
|
2
No. There is no simple "grep" of the log file from version 9 or earlier that will reveal this information.
Prior to version 10 log file entries only contain the time. Not the date. When the day rolls over an entry is written but there are many situations where that date will not be available even for more complex parsing than a simple grep.
Bonus answer: version 9 is ancient, obsolete and unsupported. You should upgrade.
Nice! I didn't know that about v9. While your bonus answer is valid, the poor asker may have his hands tied by legacy code. Lastly, using the last-modified timestamp on the file and counting back through it, a script could be written to determine the date.
– zr00
Dec 6, 2018 at 21:09
|
Is there a script available that can pull the PROGRESS BACKUP STATUS INFO from the log file?
I did a grep on "BACKUP" and it pulls the correct information, but unfortunately there is no date stamp associated. We are using Progress version 9.1D.
The end goal would look like this.
Selection Date Range:
Start Date: 07/6/2018
End Date: 7/7/2018
Output:
Date: Mon July 6 20:00 2018
20:03:28 BACKUP 10: Full Backup Started
20:51:44 BACKUP 10: Full Backup Successfully Completed
Date: Mon July 7 20:00 2018
20:03:28 BACKUP 10: Full Backup Started
20:51:44 BACKUP 10: Full Backup Successfully Completed
|
unix scripting to grep the selection date range?
|
BCP is a command line tool. You run it in the command prompt, not in SQL Editor. You can, at least in theory, run it from inside the SQL using xp_cmdshell but it might not be enabled for you, and it's usually a lot easier to handle errors etc. when you do it for example in a .bat file and not try to do it inside the database.
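A hedged example of exporting a table from the command prompt (server, database and table names are placeholders; -T uses Windows authentication, -n native format):
bcp MyDb.dbo.MyTable out D:\backup\MyTable.bcp -n -S myserver -T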
|
I am trying to export some tables to a file as a backup.
To do this I am using the bcp utility from SQL Server. I work on SQL Server 2008, but despite the fact that it should theoretically work, when I type the command in the SQL editor window it doesn't seem to be recognized...
Does somebody have a command to back up a table? That way I could create a procedure to do this in bulk.
|
bcp does not work in SQL Server 2008
|
We have been using UpdraftPlus for a long time and they are great. They allow you to back up to Google Drive or Dropbox or download directly. It is a 360-degree backup solution, and you can back up multiple websites from a central location by using a WordPress Multisite setup. They also have a free plan.
https://updraftplus.com/
https://updraftplus.com/updraftplus-full-feature-list/
https://wordpress.org/plugins/updraftplus/
Also, I have heard many good things about BackupBuddy by iThemes, although it costs much more. Depending on your budget it may also be worth looking into:
https://ithemes.com/purchase/backupbuddy/
|
Hi, I have a lot of WordPress sites (more than 30) and I'm currently using the MainWP plugin, but it takes too many resources and sometimes takes my server down.
Does anyone know if there is a plugin that backs up several sites from one wp-admin console like MainWP? I know there are several options, but they don't have the option to control every site from a single WP panel.
|
Backing up my wordpress sites
|
You would need a cron job or launchd to do the automation, but you could have it run your PHP script without a problem. All it would do is use a recursive directory iterator to copy files and make directories in a backup location. PHP has a ZIP library, if installed on your server, to create compressed files if so desired. See this page. This could be as complex as you make it. You may want to store the files on a remote server, in which case you could use the FTP features in PHP to send them.
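A minimal sketch of the copy step (paths are hypothetical; the scheduling itself would live in cron or launchd):
<?php
// Recursively copy a source directory into a dated backup directory.
$src  = '/var/www/mysite';
$dest = '/backups/mysite-' . date('Y-m-d');

if (!is_dir($dest)) {
    mkdir($dest, 0755, true);
}

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($src, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::SELF_FIRST
);

foreach ($iterator as $item) {
    // getSubPathname() gives the path relative to $src.
    $target = $dest . DIRECTORY_SEPARATOR . $iterator->getSubPathname();
    if ($item->isDir()) {
        if (!is_dir($target)) {
            mkdir($target, 0755, true);
        }
    } else {
        copy($item->getPathname(), $target);
    }
}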
|
I want to create a backup script that automatically runs three times: every Monday, on the first day of each month, and once per 3 months. My logical thinking isn't that good, so I need your help, guys. The code has to be in PHP, and the easiest way is with if.
|
backup date weekly,monthly,quarterly in php [closed]
|
You can use pg_dump --schema-only --format=plain your_db_name > dump_file.sql to dump your schema (db objects without data) into a SQL file.
Details here: pg_dump.
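To recreate the objects elsewhere, the plain-format dump can simply be replayed through psql (target database name hypothetical):
psql -d new_db -f dump_file.sql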
|
I have a pretty big db in terms of the number of distinct db objects, not size. I want to back it up (its structure, or schema). Preferably, I'd like to get the SQL code. Of course, I can navigate to each object and just get it, but, as I said, there are a lot of them.
How do I do this easily?
|
Backup db scheme (structure) easily
|
bup:
Highly efficient file backup system based on the git packfile format. Capable of doing fast incremental backups of virtual machine images.
AFAIK, an important limitation of bup currently is its inability to purge old data.
I should add that from my personal experience rsync (in the "preserve hard links" mode) + rdiff-backup work just OK for backing up mostly unchanging data. dirvish is reported to also be OK, and so does unison (though I'd say it's mostly a synchronisation tool rather than a backup solution).
You might also take a look at obnam and BackupPC — they use different approaches but perform data deduplication largely providing for the same outcome you'd get with Git and its "reusing" of blobs which are already in the repository.
Note that certain "next gen" filesystems, such as zfs and btrfs, might do file-level (and/or block-level?) deduplication by themselves, and hence backing up mostly unchanging data might be done by just copying it over and over to a set of per-backup-event directories on a single instance of such a filesystem.
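If you do want to try the bup route, basic usage looks roughly like this (the directory to back up is hypothetical):
bup init
bup index /home/me
bup save -n my-backup /home/me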
|
It has just occurred to me that git (or a git wrapper of some sort) might be a good tool for system backups or hard drive backups. Why I am thinking that, I am not sure, as I am a git newbie and have no idea how efficient, reliable or straightforward it is with huge repositories, how well it compresses, or whether it is friendly towards external compression, etc.; I just think it might be neat. Is it feasible? Desirable? Has anyone implemented it? Is there a straightforward way to implement it, preferably using open source software? (System backups can be tricky.)
|
System backup into a git repository?
|
Just back up the directory:
Win: %APPDATA%\FileZilla (type "%APPDATA%\FileZilla" in Run)
Linux, Mac, other: ~/.filezilla
Once you install a fresh copy of FileZilla, just replace the newly created folder with the backed-up one, or open FileZilla after install and click File > Import > navigate to the backed-up FileZilla folder, select the file named "sitemanager.xml" and click OK.
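On Windows, for instance, the backup copy could be as simple as this (destination path hypothetical):
xcopy "%APPDATA%\FileZilla" "D:\Backups\FileZilla" /E /I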
& Enjoy ;)
answered Dec 4, 2013 at 5:48 by Samrat Saha
|
|
If FileZilla gets corrupted somehow, how can I manually back up the existing FTP details stored by FileZilla, so that I can reuse them after reinstalling a fresh copy?
|
Manually Backup FileZilla ftp details if FileZilla got Corrupted
|
What server do you use? On Linux you can take a dump of MySQL with the mysqldump command and put it in a cron job for regular backups. With a bash script you can also name the files after the backup date.
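A minimal sketch of such a cron entry, running nightly at 2am (credentials and paths are hypothetical; note that % must be escaped in a crontab):
0 2 * * * mysqldump -h localhost -u backup_user -pSECRET mydb > /backups/mydb-$(date +\%F).sql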
answered Dec 2, 2012 at 22:55 by 0xmtn
You can use both commands on a Plesk server too
– 0xmtn
Dec 6, 2012 at 18:45
|
|
I'd like to know what options you recommend when it comes to backing up the information stored in MySQL databases.
Maybe there are some ready-made solutions/plugins available that I can install on my server to run daily/weekly backups?
Thank you!
|
Ways to backup MySQL databases regularly? [closed]
|
Copying a DBF will fail if anyone has it open EXCLUSIVEly.
We do use robocopy here as a secondary backup, but we schedule it to run at (IIRC) 3am, when no live users are in the system and automated processes are likely to be finished. We don't entirely trust a copy made during working hours when users are writing to tables a lot - no way to know whether the resulting tables will be consistent.
Also, you'd have to worry about the .CDX and .FPT (if applicable) being out of sync with the .DBF, since they'd be copied at a different time.
So the answer to your question is "Probably not, unless you can be sure no writes are taking place (and of course the tables are not opened exclusively)."
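For illustration, the off-hours job could be as simple as this (paths and file masks hypothetical):
robocopy C:\data\vfp D:\backup\vfp *.dbf *.cdx *.fpt /Z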
|
Is it safe to make a copy (via cmd with copy or robocopy, or by copying and pasting it) for backup purposes of a file-based multi-user Visual FoxPro database while other users are accessing it?
|
Safe to copy a file-based multi-user Visual FoxPro database while users are accessing it? [closed]
|
You need to create a scheduled task in the Windows control panel and pass it the mysqldump command,
something like this:
pathtomysql/mysqldump -h DB_HOST -u DB_USER -pDB_PASS db_name > local_filename
If you want to do that in PHP then you can as well:
create a scheduled task in the Windows control panel, and pass it the path to your PHP script,
something like this:
pathtophp/php.exe pathtoscript/myscript.php
Keep in mind it probably won't use the same php.ini as the webserver, so your config might be different.
You can call this to back up your database from PHP:
system("/pathtomysql/mysqldump -h $host -u $user -p$pass $db_name > $tofilename 2>error.txt", $ret);
This also pipes any error messages to error.txt,
and you can read that file and email them to yourself if you feel the need.
answered Apr 19, 2011 at 5:05 (edited Apr 19, 2011 at 5:10) by bumperbox
|
|
How do I take an automatic MySQL database dump daily using PHP on Windows?
|
MySql Database Dump [closed]
|
You've got SMO in the title of your question. Why not just use SMO for backup and restore?
|
Does anyone know of a library for SQL Server database backup and restore for .NET?
This is needed to avoid writing a new one. There are so many people out there sharing their libraries, but I found it strange that I couldn't turn up anything related by googling.
|
Any SMO Library for .NET?
|