Streamlining IT Support in NJ: Tips for Efficient and Effective Database Services

If you’re a small to midsized business (SMB) looking for comprehensive third-party IT support in NJ, you might have noticed a problem: many managed service providers (MSPs) do not specifically offer database administration (DBA) services. The care and feeding of databases is business-critical and takes special skills. Yet MSPs tend to lump DBA services in with other IT administrative functions.

For example, here is a “menu” of services available from an MSP offering IT support in NJ:

Popular IT Support Services in New Jersey

  1. Help desk
  2. 24×7 remote monitoring and management of your IT environment
  3. On-site IT support as needed
  4. Offsite backup/storage
  5. Email security
  6. Managing your public cloud footprint
  7. Training on cybersecurity awareness and other topics

Those services cover a lot of ground, including everything from computers and mobile devices to your business networks to your company internet to cybersecurity. But what about DBA activities, including specialized tasks like performance tuning, database development, database security, managed database hosting, etc.? Possibly those are covered under service #2… But how thoroughgoing is the actual service? Are the people watching your database certified DBA experts? Or general-purpose “IT guys” who may or may not know your database environment, yet have your most valuable and sensitive data in their hands?

IT Support NJ: Ensuring first-rate DBA services

Whether you have mission-critical data housed in Oracle, Microsoft SQL, MySQL, PostgreSQL, or another database environment, your MSP could very well be using jack-of-all-trades IT systems administrators to manage your database environment—not specialist DBAs. That might be fine if your database estate is simple, or you’re lucky enough to have in-house know-how to backstop your MSP. Or you might find out the hard way that your MSP’s current level of DBA expertise is not enough, by experiencing inefficient database operations, application downtime, compliance issues, data loss, and/or security vulnerabilities.

What’s the alternative for SMBs seeking comprehensive IT support in NJ or the New York City metro area, including expert DBA services? You can augment your MSP’s IT support with an outsourced NJ-based DBA who functions as an extension of your MSP’s team. Or you can find an MSP that has certified DBAs onboard. Either way, you keep all the managed service benefits, like cost savings and on-demand flexibility, while improving IT efficiency and reducing database-related business risk.  

Should you consider an outsourced DBA partner?

Regarding your current level of DBA support, does your database infrastructure run smoothly with little to no downtime, slowdowns, or other issues that frustrate users? Do your users, customers, etc. enjoy application performance and reliability that meets agreed service levels? Is your database environment proactively managed to address potential problems before they impact users? Is your data helping you meet business goals?

If you’re answering any of these questions in the negative, consider adding a specialist DBA partner to your managed IT support in NJ. You need to feel confident that whoever is managing your database knows exactly what they’re doing. Your database is too important to trust to chance.

A DBA partner can work with your business directly, alongside your MSP, to provide the great database support your business needs and deserves. 

Streamlining IT Support in NJ: What’s next?

If you’re looking for IT support in NJ that includes reliable, cost-effective, expert DBA services, contact Buda Consulting to schedule a free 15-minute call. We can help you get more value from your data and your database investments.

5 Tips for a Successful Oracle Database Migration

Companies need to migrate Oracle data for a long list of reasons, such as moving workloads from on-premises to a cloud or managed hosting environment, implementing a new system, or launching a big data initiative. But whatever their aim, Oracle database migration projects are widely known to pose a major risk of failure and budget overruns. 

A big contributing factor is the misconception that these are simple data-moving operations. In fact, Oracle database migrations are often complex and require careful planning. Data migration services and technology alone are not a guarantee of success.

This post shares five tips for a successful Oracle database migration.

One: Recognize that Oracle database migrations are a business problem.

Any Oracle database migration is a data problem—and therefore a business problem—first, and an IT problem second. Not the other way around as conventionally assumed.

Without engagement from business and technical leaders, the chances of failure are very high. It’s important to get input from management stakeholders and to know upfront that adequate project resources will be allocated.

Too often, a lack of business involvement and commitment results in an Oracle database migration project that is fraught with unknowns and incorrect assumptions. These later manifest as data loading problems, functional testing failures, and other stumbling blocks leading to time and cost overruns. Eleventh-hour emergencies may also lead to engaging third-party Oracle data migration services without adequate due diligence.

Two: Don’t underestimate the scope of your Oracle database migration.

A big reason Oracle database migration projects falter is that they weren’t correctly scoped in the first place. Many organizations underestimate the effort required to migrate an Oracle database successfully. 

Some of the reasons include:

  • Lack of understanding of the current data landscape
  • Lack of awareness of current data quality issues
  • Viewing the migration as a simple data-moving exercise
  • Failure to get input from business stakeholders on their needs during the project
  • Inadequate data migration services, tools, and/or expertise to support the migration

Plus, like many IT-related projects, Oracle database migrations are subject to scope creep and specification changes. The better you can analyze your current data, including its volume, data types, etc., the better you can scope the migration. 
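To make that analysis concrete, here is a minimal data-profiling sketch in Python. It uses SQLite purely for illustration, and the `orders` table and its columns are hypothetical stand-ins for your own schema; a comparable query against Oracle's data dictionary views would serve the same purpose during scoping.

```python
import sqlite3

# Hypothetical source table standing in for the database being scoped.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL, placed_on TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)",
                 [(1, "Acme", 100.0, "2020-01-15"),
                  (2, "Acme", None, "2021-06-01"),
                  (3, None, 55.5, "2022-03-09")])

def profile_table(conn, table):
    """Report row count, declared column types, and per-column NULL counts."""
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    report = {"rows": total, "columns": {}}
    for _, name, ctype, *_ in cols:
        nulls = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {name} IS NULL").fetchone()[0]
        report["columns"][name] = {"type": ctype, "nulls": nulls}
    return report

print(profile_table(conn, "orders"))
```

Running a profile like this over every table gives you the volume, type, and completeness figures that an accurate migration scope depends on.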

Three: Focus on data quality from the outset.

“Garbage in, garbage out” is an inescapable reality when it comes to data. Why spend money moving data to a new environment or a new application if it isn’t usable? Yet it’s often when an existing database is migrated that errors, gaps, corruption, redundancies, and formatting issues rear their heads. 

Even if the data was acceptable for its prior uses, it might not meet your new objectives, such as access by new applications. A successful Oracle database migration requires an upfront emphasis on delivering accurate data that meets business needs. Finding out at the last minute that data requires cleansing is sure to lead to project delays, budget crunches, and specification changes. 

Four: Leverage appropriate Oracle technology.

From data cleansing to data movement to data governance, purpose-built Oracle technology can help automate your Oracle database migration to save effort and improve consistency and repeatability. Choosing Oracle tools also adds to the value of your Oracle investment.

For example, if you’re moving an on-premises Oracle database to the Oracle Cloud, Oracle offers a wide range of data migration services and tools to help you migrate data into your target cloud service (e.g., Oracle Autonomous Database, Hadoop, or Object Storage). These include:

  • OCI Data Transfer Service, a low-cost data migration service that accelerates moving even large-scale datasets to or from Oracle Cloud.
  • Zero Downtime Migration (ZDM) for more efficient migration of on-premises databases to the Oracle Cloud leveraging high availability technologies like Oracle Data Guard and Oracle GoldenGate. 
  • Oracle Data Pump to move data between Oracle databases via a choice of methods, including in the cloud and between on-premises and cloud.

Five: Leverage appropriate data migration services.

“We don’t know what we don’t know.” Thus, businesses may be unaware of their Oracle database migration challenges until they are blindsided in midstream. 

While data migration services can help reduce time and cost impacts in these situations, they can also be of value upfront by helping you avoid or prepare for them. Data migration services can help you sort out your best options for moving Oracle workloads from your data center to a public cloud platform or managed database hosting provider, for instance. This starts you off on the right foot and helps eliminate risks to project success.

Data migration services can also give you on-demand access to specialized Oracle expertise that many businesses don’t have in-house. Unbiased, third-party experts can save time with valuable insights, as well as champion the best course of action from a range of options.

What’s next?

If you’re thinking of moving Oracle databases to a cloud or managed hosting environment, Buda Consulting can help you choose the best option for your workloads. We can also handle the complete migration process for you, from installing and configuring your new Oracle environment to migrating your data. All while minimizing downtime and business risk.

Contact us for a free “database discussion” to explore your Oracle database migration goals and concerns.

5 Ways Remote DBA Services Can Help Your Business

Many SMBs today look to establish and maintain a core IT team that has the expertise and agility to respond to business demands, such as developing and updating custom applications. For non-core/maintenance processes like database administration, there are many benefits to partnering with third-party experts who can handle these time-consuming, specialized tasks reliably and efficiently. This is why remote DBA services have become one of the most prevalent forms of IT outsourcing.

What are remote DBA services?

Remote DBA services can cover a wide range of everyday database tasks, including:

  • Database administration (e.g., user administration and space management)
  • Performance monitoring and tuning
  • 24×7 database support 
  • Report development
  • Database health checks and troubleshooting
  • Database security monitoring
  • Database upgrades

Here are 5 ways that remote DBA services can benefit your business.

One: Save money and reduce business risk associated with hiring skilled technical resources

Demand for skilled DBAs exceeds supply, and salaries average well above $100,000 plus benefits. Hiring qualified technical staff is time-consuming, challenging for HR staff, and risky due to high turnover and potential competence issues. The cost to engage remote DBA services is generally 40% to 60% lower than hiring a full-time employee. Further, a remote DBA services team can offer specialized skills and up-to-date product knowledge on demand that the individual you hire may not have.

Two: Better staff continuity

Ironically, high turnover and poor retention of skilled technical resources mean that many businesses get better staff continuity and knowledge transfer by leveraging remote DBA services than by hiring in-house DBAs.  At a minimum, leveraging remote DBA services means you can rest assured of continuity of services, without the risk of losing a key DBA and being left without that critical resource. You’re also less likely to be impacted by inevitable issues like vacations, maternity leave, illness, etc. Many remote DBA services offer 24×7 coverage for your environment, with rapid response in emergencies.

Three: Improved database operations

Especially for SMBs struggling to cover critical database tasks, relying on remote DBA services can improve your database availability and performance. Remote DBAs have your database environment as their top priority and can be more proactive about improving service levels versus reacting to problems in a break-fix manner when they manifest. For example, database health checks are key to ensuring maximum uptime for your critical databases. But these can be complex to organize and perform to avoid impacting users and workloads. Remote DBA services can ensure that health checks and other maintenance and administrative tasks are performed routinely and efficiently. 

Four: Increased scalability

As your business grows or your specific database needs change, you might find you need to augment current database staffing. A remote DBA services arrangement makes it easy to get more bandwidth and/or additional expertise without the extensive cost and time required for a full-time hire. A remote DBA can also support your business to scale up by ensuring that your database can handle increasing amounts of data and transactions.

Five: Improved database security

Information security is a specialty discipline, and database security is a specialty within that specialty. Many DBAs don’t have these hard-to-find skills, which results in critical data being vulnerable to cyber-attack. Your business may also be out of compliance with security requirements in applicable regulations like HIPAA, PCI-DSS, Sarbanes-Oxley (SOX), the Gramm-Leach-Bliley Act (GLBA), etc. A remote DBA service can help improve your database security by applying best practices, including helping you develop strong security policies, patching your database environment, auditing your databases for vulnerabilities with automated tools, and reviewing user and administrative rights and privileges.

Next Steps For Remote DBA Services

Buda Consulting has been a preferred remote DBA services partner to global brands and SMBs for Oracle, SQL Server, MySQL, and other database technologies for over 25 years. Our staff consists of certified database professionals capable of managing the most sophisticated database architectures.  Contact us to talk about your database requirements and how Buda Consulting can help.

Thoughts On Deleting Data – Considerations & Best Practices

In a recent blog post on database maintenance tips, I mentioned that one important facet of cleaning up the database is to remove records that we no longer need — those that don’t contribute value to the applications and users who use the database. This is an important maintenance process, but there are some equally important considerations when thinking about deleting any data from the database. 

These considerations are driven by a few key questions that we need to ask:

Why are we deleting the data? 

Performance? Disk space cost? Security? Organization? Simple lack of value? Let’s look at each of these reasons and look for alternatives to deleting the data in order to avoid losing any untapped future value stored in that data.

Performance

If we are thinking of removing data as a way to improve performance, can we instead use partitioning, archiving, or indexing to achieve adequate performance while preserving the data? Can we tune the most expensive queries to reduce load on the system? Can we increase resources on the server or move the database to a more powerful server?
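As a toy illustration of the indexing alternative, here is a sketch in Python using SQLite; the `invoices` table is hypothetical, and Oracle's optimizer and partitioning features are far richer, but the principle is the same: let an index satisfy the filter instead of scanning (or deleting) the data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO invoices (customer_id, total) VALUES (?, ?)",
                 [(i % 100, float(i)) for i in range(1000)])

# Without an index, this filter scans the whole table; with it,
# SQLite can seek directly to the matching rows.
conn.execute("CREATE INDEX idx_invoices_customer ON invoices (customer_id)")

plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM invoices WHERE customer_id = 42").fetchall()
print(plan)  # the plan mentions idx_invoices_customer rather than a full scan
```

If the query plan shows the index being used, the performance problem may be solvable without removing a single row.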

Disk space cost

If our purpose is to reduce the cost of disk space to store the data, can we partition the data and archive the older or less-used data to lower-cost storage? Is compression an option on our hardware platform (e.g., Oracle’s Exadata platform)? Can we remove some indexes that are taking up space but not adding a performance boost?

Security

If we are seeking to remove the data to improve security by reducing the data footprint, can we leave the data there and achieve the level of security that we need by using an encryption scheme, a virtual private database (Oracle), or another tighter access control scheme like label security?

Organization (reducing clutter) 

If we are removing the data because we don’t want to see it (because the reports, queries, and dropdowns in the application screens are unwieldy), can we tag records as deleted instead of actually removing them and filter queries based on these tags? Can we create views for each table that filter these records, and use synonyms to redirect apps to those views to minimize application changes?
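A minimal sketch of the tag-and-filter approach, using Python and SQLite for illustration (the `products` table is hypothetical; SQLite has no synonyms, so the Oracle redirection trick appears only as a comment):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, is_deleted INTEGER DEFAULT 0)")
conn.executemany("INSERT INTO products (name) VALUES (?)",
                 [("widget",), ("gadget",), ("obsolete-part",)])

# "Delete" by tagging the row instead of removing it.
conn.execute("UPDATE products SET is_deleted = 1 WHERE name = 'obsolete-part'")

# A view filters out the tagged rows; applications query the view
# instead of the table. (In Oracle, a synonym could redirect existing
# code to the view without application changes.)
conn.execute("CREATE VIEW active_products AS "
             "SELECT id, name FROM products WHERE is_deleted = 0")

print(conn.execute("SELECT name FROM active_products ORDER BY name").fetchall())
# → [('gadget',), ('widget',)]
```

The tagged rows stay available for audits or reinstatement, while day-to-day queries never see them.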

Lack of value

If we are certain that the data really has no value, let’s get rid of it. This is the data that just sucks energy from the system and gets in the way. But when doing so, let’s be sure to do so subject to the considerations below. Even deleting valueless data can cause problems if we are not careful.

Considerations before deleting anything

Once we are convinced that the data has to go, we have to ask all of the following questions in order to create the proper process for deleting it. 

Interfaces with external systems

Downstream and upstream systems may break as a result of deleting data.

Downstream systems may contain supplemental data that will be left dangling if you delete a record from your system and do not also delete it (and the associated supplemental data) from the downstream system. This can cause applications to fail or worse, can cause invalid results to appear on reports.  

Upstream systems may be subject to numerous problems as well. They may re-introduce the same records that you delete, or they may send child records that are associated with records that you deleted, causing interfaces to fail. Worse, without proper logging in your interfaces, errors like this can go undetected.

Of course, this problem can be recursive. Each of the upstream and downstream systems may have upstream and downstream systems of their own having the same potential risks and complications.

Constraints and Dependencies

Are there database constraints with triggers that would result in child data being deleted and do we want this? As we think about whether we want to delete older customer invoices for example, do we want to delete the order history at the individual item level? We may not want order history with respect to a customer after 7 years, but do we want to lose the information about the quantity of each item that was ordered over time?  If we want to keep the item order counts but not the invoices, then we may need to store that data differently in order to be able to delete the invoices.

Or conversely, is there any data that would be left dangling because there are no integrity constraints defined to delete child data?  If we delete old customer records for example, will there be customer address data, demographic data, or customer preference data left behind resulting in data inconsistencies and invalid reporting? This is just one reason why database integrity constraints are so important in a database design, but that is a topic for another post.

All these questions must be applied to the child data as well. If we delete this data, is there dangling parent data that would be left useless or meaningless without its detail and therefore should be deleted as well?
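The dangling-data check described above can be automated. Here is a sketch in Python using SQLite; the `customers` and `addresses` tables are hypothetical, but the anti-join pattern applies to any parent/child pair that lacks enforced integrity constraints:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE addresses (id INTEGER PRIMARY KEY, customer_id INTEGER, city TEXT);
INSERT INTO customers VALUES (1, 'Acme');
INSERT INTO addresses VALUES (10, 1, 'Trenton'), (11, 2, 'Newark');
-- customer 2 was deleted without its address: row 11 is now dangling
""")

# Find child rows whose parent no longer exists.
orphans = conn.execute("""
    SELECT a.id, a.city FROM addresses a
    LEFT JOIN customers c ON c.id = a.customer_id
    WHERE c.id IS NULL
""").fetchall()
print(orphans)  # → [(11, 'Newark')]
```

Running this kind of query before and after a delete surfaces the inconsistencies that would otherwise show up later as invalid reporting.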

Retention Requirements

Does your organization have any retention or destruction requirements for this data? Better check with legal! Aside from your organization’s own data management policies, numerous regulations specify how long data of different classifications must be retained. For example, as mentioned in this article by USSignal, the Fair Labor Standards Act, the Bank Secrecy Act, the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), and the Federal Information Security Management Act (FISMA), among others, all specify data retention requirements. Be careful not to delete any data while you may be required to produce it in the event of a legal action.

The Deletion Process

So we have confirmed that we want to delete a set of data, we have confirmed what ancillary data needs to go with it and in what systems, and we have confirmed that we can legally delete the data.  Now we have to think about how to do it safely. Here are some guidelines that we follow before hitting that delete button.

Script It

All delete commands must be scripted! All delete commands must be scripted! It was worth saying that twice. Accidentally deleting records is an obviously bad thing, and accidental commands are much more likely when working at a command prompt in SSMS or SQL*Plus than when we have carefully crafted the commands and placed them in a script with comments and logging.

Preview and Approval

A preview of any data to be deleted should be made available to the application/database owner with authority to approve the removal of the data. This can be done by issuing the same command (same criteria) that will be used to delete, but as a simple select command. It can be presented in a CSV file, spreadsheet, or a live query. The preview should either detail all of the child and parent data that will be removed, or just the parent records along with a written description of the child data that will be deleted. The approver must be made aware of whether or not the delete process is reversible, and must approve in writing before the data is actually deleted in the production environment.
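One way to guarantee the preview matches the delete is to share a single criteria string between the two statements. A minimal sketch in Python using SQLite (the `invoices` table and cutoff date are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (id INTEGER PRIMARY KEY, customer TEXT, invoice_date TEXT)")
conn.executemany("INSERT INTO invoices VALUES (?, ?, ?)",
                 [(1, "Acme", "2012-05-01"), (2, "Globex", "2023-02-10")])

# One criteria string shared by the preview and the delete, so the
# approver sees exactly what the delete will remove.
CRITERIA = "invoice_date < '2015-01-01'"

preview = conn.execute(
    f"SELECT id, customer, invoice_date FROM invoices WHERE {CRITERIA}").fetchall()
print("To be deleted:", preview)   # present this to the approver first

# ...only after written approval:
conn.execute(f"DELETE FROM invoices WHERE {CRITERIA}")
print(conn.execute("SELECT COUNT(*) FROM invoices").fetchone()[0])  # → 1
```

Because the select and the delete share one WHERE clause, there is no chance the approved preview and the actual deletion drift apart.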

Make it Repeatable

One reason to script the process is so that it is repeatable. The script should be based on criteria such that if additional data that matches that criteria is introduced after the initial delete, we can rerun the script and catch the additional data. This is also very useful if we have the ability to issue the commands in staging or test environments before doing so in production.

Do it on Staging/Test first

It is very possible that after deleting data, the user will realize that they did not account for all of the implications of doing so. Whenever possible, the exact delete process should be done in a staging or test environment before doing so in the production environment, with end user testing before the actual production delete. 

Log it

Finally, we must log the delete. This means actually keeping a record of the records that were deleted. This does not mean keeping all of the data, just a record of a few key fields so the removal of the records can be traced should there be questions later. For example, if old invoices are removed, keep a record of the customer number, the invoice number, and the date of those records that are removed. This can be done with simple select statements executed prior to the delete command using the same exact criteria.
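A sketch of that logging step, in Python with SQLite for illustration (the `invoices` and `deletion_log` tables and their columns are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE invoices (id INTEGER PRIMARY KEY, customer_no INTEGER,
                       invoice_no TEXT, invoice_date TEXT);
CREATE TABLE deletion_log (deleted_at TEXT, customer_no INTEGER,
                           invoice_no TEXT, invoice_date TEXT);
INSERT INTO invoices VALUES (1, 100, 'INV-001', '2010-01-01'),
                            (2, 200, 'INV-002', '2024-01-01');
""")

CRITERIA = "invoice_date < '2015-01-01'"

# Record the key fields of every row about to be removed...
conn.execute(f"""
    INSERT INTO deletion_log
    SELECT datetime('now'), customer_no, invoice_no, invoice_date
    FROM invoices WHERE {CRITERIA}
""")
# ...then delete using the same exact criteria.
conn.execute(f"DELETE FROM invoices WHERE {CRITERIA}")

print(conn.execute("SELECT customer_no, invoice_no FROM deletion_log").fetchall())
# → [(100, 'INV-001')]
```

The log table holds only the key fields, not the full rows, so it stays small while still making every removal traceable.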

Is it worth it?

That all sounds like a lot of work, and it is. But the implications of deleting data are significant and restoring or reproducing deleted data can be extremely difficult and time consuming or even impossible, so a thoughtful, diligent process is required and worth all of that work.

When Should The Database Be Updated?

Why “if it’s not broke, don’t fix it” does not work for databases (or anywhere in IT, for that matter)

One of the hotly debated items among IT professionals is the age-old question, “When should the database be updated?” At Buda Consulting we always like to make sure our clients are running the latest, secured, and supported versions of any software in any environment we manage. This includes database products from Oracle Database and Microsoft SQL Server to PostgreSQL. But we have noticed that this has not always been the case when we come into a client’s company and perform our global product health check.

In my experience I have worked with DBAs and system administrators who have always said that if it is working, we should not touch it, and I can understand why some professionals and managers may feel this way. When your database or application is offline, it creates stress as administrators are tasked with getting the service(s) back online as soon as possible. The idea is that if we do not touch anything, it should just work without issue, but experience shows this is not always the case. When it comes to databases specifically, not touching a database from time to time can have catastrophic results.

As DBAs, if we did not look at your database’s tablespace stats, we would never know when your instance was about to run out of space at the tablespace or filesystem/ASM disk group level. Not noticing this would eventually leave your database unable to write data, crashing your application and database.

Another excuse (yes, that is what I call not upgrading your software!) I hear from time to time is that new software versions introduce bugs. That is true, but almost all software versions introduce bugs. Most are outlined in the KNOWN BUGS section of a software release’s readme, while others have yet to be discovered. What this excuse does not take into account is that new software usually fixes bugs and security exploits that were not patched in the older version of the software. Whenever you are in doubt, contact Buda Consulting for a database security assessment.

Let’s determine “When should the database be updated?”

As someone who has worked in both the private and public sectors of IT, I have seen the dire consequences of failing to keep software up to date. This is a widespread problem in most public sector entities, as most do not generate revenue but provide a service for the citizens of their state. Because money is usually very scarce, most IT budgets tend to get trimmed to the detriment of the agency. I have seen time and time again where a mainframe service was not maintained over the years because the original administrators of the platform either moved on or retired. Because these admins were the ones that implemented the platform, once they left, the knowledge of administering and maintaining the platform left with them.

This caused new staff who did not know the platform to just “keep the lights on” and not patch the environment for fear of breaking something that was not broken. Over time the software running the platform moved further away from the latest version until a direct upgrade path was impossible without vendor intervention or consulting services. Once the vendor is involved, you can expect the cost of the upgrade to not be cost-effective. I have seen quotes for upgrade work as high as two million dollars to upgrade mainframe systems, costs that could easily have been avoided had both old and new administrators put forth their best effort to keep the platform running the latest software.

It is industry best practice, especially when it comes to databases, to move to a new software version only after the release of the first service pack. For instance, as of the writing of this article, Oracle’s latest database software is version 21c. Once service pack one of 21c (21cR1) is released, all companies using the 21c base release or older software versions should start creating an upgrade plan, to be implemented within six months to a year. As I explained above, by not keeping your software upgraded to the latest version you put your company at risk of having to spend a lot of money down the line to hire an outside company to perform the upgrade, once you are no longer able to easily upgrade from one version to the next.

So if you are running Oracle Database 11g or 12c, it’s time to start planning an upgrade to at least 19c or 21c. If you are running Microsoft SQL Server 2016, it’s time to start planning an upgrade to at least SQL Server 2017 CU24 or SQL Server 2019 CU11. We cannot stress enough that the old “if it’s not broken, don’t fix it” methodology needs to go away. In the age of constant security breaches it is more important now than ever to keep your software up to date with the latest patches, to make sure you are protected against the worst of the software exploits running around the interwebs.

And if you like this article, please share it with your colleagues and subscribe to our blog to get the latest updates. Schedule a 15-minute call with Buda Consulting today.

Database Maintenance Tips: Weeding the Database

This time of year I spend a lot of time in the garden. Early mornings are peaceful and quiet, and the perfect time to reflect on the garden and clean it up. This is a discipline that I also apply to the databases that I manage. I will describe the process that I follow when tending to the garden, and then talk about how a similar process can help keep your database healthy, functional, and efficient.

How I maintain the garden

I look carefully at each bed, and think about what belongs and what does not. What is contributing to a healthy garden, and what is sucking energy away from what I really want to grow. There are multiple levels to this process:

I start by defining the edges of each bed, making sure that there is a well defined edge where this bed ends, and the lawn begins. Then I remove the weeds. Weeds of course are just plants that I don’t want there. They are not bad things, they just don’t contribute to the overall purpose of the bed, and they suck energy from the plants that do. They also make things look messy and compromise both the maintenance of the bed and enjoyment or utility of it by causing us to think about what belongs and what does not belong each time we look at it or need to work in it.

After we are done defining the edges, we drill into each individual plant in the bed. We look for dead blooms or branches that can be removed so they don’t suck energy from the living parts of the plant. We stand back and determine how we want the plant to look, and we prune so it does not get messy. Essentially, we are ensuring that each plant plays the part we wish in the overall function of the bed.

As we do this, we ask questions like “What is the function of the bed? Is it to screen for privacy, to provide color, or to absorb wind?” and “What is the function of each plant within the bed?” The answers to these questions determine how tall or wide we let the bushes grow, how close together we let them get, and which ones will be allowed to take the most room.

Finally we lay mulch to minimize the weeds and to conserve the moisture in order to minimize maintenance costs and to protect these valuable plants.

What does this have to do with a database?

Managing a database is similar to managing a garden bed. Let’s look at how the above process is similar to proper and thorough database maintenance.

Defining the edges

This is a critical step for securing and maintaining our databases. There are at least three ways that the edges of our databases get messy (compromised).

User Access

Users of course are given access to our data. Initially this is done with intention. But invariably, someone needs some data for a report, a request is made for access, and the access is granted, often with the intent of the access being temporary.  From then on, data may leak from our database to external systems without real intention. In a well secured database, all access must be identified, documented, approved, and intended.

Power User Data

Often, data must be combined with external data for a specific report. A power user will build a table in our database for that purpose, and a script will be created to keep that “external table” up to date in our database. This external table may grow in unexpected ways because the application owner does not control it. In a well managed database, all data coming into the database must be identified, documented, approved, and intended.

Database links

Links may be set up to enable another application to either feed data to the subject database or to query data from it. These links must be examined on a regular basis to ensure they are still necessary, and that the linked database is as secure as the subject database is.

Removing the weeds

Once the edges are defined, we must look at all database objects and make sure they are contributing to the purpose of the database. We want to remove any tables, stored procedures, or functions that were introduced during tests or trials, any that serve obsolete functions, and any that were created by power users and are no longer needed or were not authorized. This is both for security purposes and to ensure that no resources are being wasted.

Pruning the plants

Now that we only have valid objects in the database, we apply the same process to the remaining objects. This is a more time consuming and difficult process, but it is important to ensure that the system continues to remain secure and valuable to the organization. This means ensuring that tables only contain data that is relevant and that contribute value. There are two dimensions to this and they are most easily thought of in slightly technical terms: Rows and Columns.

Rows (records)

Are all rows or records in the database valuable and relevant? For the most part, this will be time-based. If the system is now 10 years old, do we need all 10 years of data in the database? If not, can we purge some of that old data, or can we archive it so that it is out of the primary database environment in order to improve performance and decrease resource consumption? Beyond time-based concerns, there may be other examples, such as products that are no longer sold for which we store a lot of data. Can these be purged? Or perhaps we spun off an entire division of the company, but the old data remains, taking up space and impacting performance.
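The archive-then-purge option above can be sketched as follows, using Python and SQLite for illustration; the `orders` tables and the cutoff date are hypothetical, and in a real environment the archive would typically live on cheaper storage or in a separate database.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER PRIMARY KEY, placed_on TEXT, total REAL);
CREATE TABLE orders_archive (id INTEGER PRIMARY KEY, placed_on TEXT, total REAL);
INSERT INTO orders VALUES (1, '2013-04-02', 50.0), (2, '2024-08-15', 75.0);
""")

CUTOFF = "'2015-01-01'"

# Copy old rows out of the primary table, then purge them, inside one
# transaction so a failure cannot leave a row in both places or neither.
with conn:
    conn.execute(f"INSERT INTO orders_archive SELECT * FROM orders WHERE placed_on < {CUTOFF}")
    conn.execute(f"DELETE FROM orders WHERE placed_on < {CUTOFF}")

print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0],
      conn.execute("SELECT COUNT(*) FROM orders_archive").fetchone()[0])  # → 1 1
```

Wrapping the copy and the delete in a single transaction is the key design choice: the purge either completes in full or not at all.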

Columns (fields)

Are there unnecessary data points stored in any of our tables? In customer data there may be PII data that is not necessary and poses a security risk. In chemical lab data there may be unnecessary test results in a large table that will never be used but that are taking up space in the database and dragging down performance needlessly.

Laying the mulch

After we have cleaned up a garden bed, we lay mulch to reduce weeds, to improve the overall function and appearance of the bed, and to both protect the health of the bed and improve the efficiency of the bed by conserving water.

The equivalent to mulch in database maintenance is a well designed and implemented maintenance plan. This means regularly scheduled health checks (looking for weeds), maintenance jobs (pruning the dead branches),  and validation of backups and disaster recovery processes (Ok, I am struggling for an analogy for that one – maybe ensuring that the sprinkler system works in case there is no rain?).

It’s worth it

We hope you found these database management tips useful. A well managed database is like a beautiful garden. It requires hard work and discipline but the rewards are plentiful. A well maintained garden provides healthy fruits and vegetables and beautiful colors and scents. It attracts songbirds and butterflies, enhances curb appeal, and adds value to your home. A well maintained database provides accurate and timely information to your team to accelerate growth and ensure excellence in execution, and provides critical insights to management that are necessary to keep the business healthy and responsive to changing markets and regulations.

So get out there and bring your clippers!