At Our Time, we believe privacy isn’t just about ticking a box. It’s about trust. It’s about people. And most of all, it’s about doing what’s right, even when no one’s looking.
That’s why we’re proud to have Rebecca Ruddock as our Head of Ethical AI and Impact. Rebecca’s role is more than just overseeing data and compliance. She makes sure that everything we build, from smart tools that reduce admin to predictive insights that help care teams thrive, is rooted in ethics, transparency, and human values.
Let’s take a closer look at what that really means.

What is Data Privacy, and Why Does It Matter?
Think of data privacy like looking after someone’s diary. It’s personal, it’s private, and you only open it with their permission. In the digital world, our “diary” might include where we live, when we work, or even how we feel at work. When companies collect this kind of data, they have a responsibility to keep it safe, explain why they need it, and never misuse it.
That’s what data privacy is all about: giving people control over their own information.
In the world of social care, this matters even more. Care workers and managers handle sensitive information every day. They need to know that their own data, and the data of those they support, is protected and respected. That’s where we come in.
Rebecca’s Role: Championing Ethics and Privacy
As our Head of Ethical AI and Impact, Rebecca ensures that Our Time doesn’t just meet legal standards but exceeds them, embedding ethics and privacy into everything we do and making sure Our Time lives up to its values. Her role spans three key areas:
- Ethical AI: Rebecca ensures the algorithms we use are fair, transparent, and free from bias. She works to make sure that the AI isn’t just smart, but responsible.
- Privacy and Data Protection: Rebecca plays a critical role in building systems that only collect the data we truly need and protect it with the highest level of security. She ensures our users’ information is kept safe and only used for its intended purpose.
- Impact and Inclusion: Rebecca is committed to making sure our technology works for everyone. She champions inclusivity, focusing on making sure that the tools we build are designed to help those who are often overlooked, especially in the social care sector.
Rebecca works across the entire company, collaborating with teams from product and design to operations and client partnerships. She constantly asks the tough questions:
- Do we need this data?
- How can we protect it better?
- Are we explaining our practices in a clear and honest way?
- Is this tool making life easier, fairer, and more inclusive for all users?
Thanks to Rebecca’s leadership, we’re not just using AI; we’re using it in an ethical way that supports people, respects their privacy, and builds trust across our platform.

Our Five-Phase Ethical AI and Data Approach
We’ve built a five-phase approach to ethical AI and data privacy. These are the guiding steps that help us stay transparent, responsible, and user-centred:
1. Discovery
We begin by identifying the real needs of our users. We ask: What are the challenges they face? What would actually help?
2. Design
We embed privacy and ethics into our design from day one. This is known as Privacy by Design: no retrofitting, no shortcuts.
3. Development
During this stage, we build and test our AI systems with fairness, transparency, and usability in mind. We reduce bias and keep users in the loop.
4. Deployment
We only roll out features once we’re confident they’re safe, clear, and beneficial. We also explain how they work in plain English.
5. Feedback & Improvement
We stay open to feedback and continuously improve. If something’s not right, we fix it. Users are always part of the conversation.
Why This Matters for Care Workers and Managers
If you work in care, you already have enough to think about: shifts, staff shortages, last-minute changes, and lots of paperwork. You shouldn’t have to worry about whether your data is being misused or whether your information is being tracked without your say-so.
With Our Time, you can trust that:
- We only collect what we need.
- We’re open about how we use it.
- We never sell your data.
- We protect it with strong security.
- We include users in the conversation.
Rebecca’s work means that behind every smart insight or helpful feature in our platform, there’s a solid foundation of ethics and care.

How We Go Beyond the Basics
Lots of companies follow the law. That’s good, but it’s not always enough.
At Our Time, we want to lead by example. That’s why we:
- Use Privacy by Design: privacy isn’t an afterthought; it’s part of how we build from the start.
- Follow the UK GDPR and ICO principles, and go beyond when we can.
- Keep things easy to understand, with no jargon or confusing terms.
- Make sure our AI is explainable, so you know why it’s making a suggestion, not just what it suggests.
Rebecca also runs regular ethics reviews to make sure we’re staying on the right path. She works with external experts to check our approach and helps train our whole team, from engineers to client leads, so we all understand how to use data responsibly. It’s about building confidence, not just compliance.
We’re not just following the rules; we’re trying to raise the bar.
A Culture of Care, Not Just Code
Our Time was built to help people spend more time on the things that matter. That starts with trust.
By treating privacy and ethics as core parts of our platform, not just legal requirements, we’re showing care teams that we’re on their side. Rebecca’s leadership helps us make sure that AI supports people, not the other way around.
We know that trust is earned, not given. And we’re working hard every day to earn it.

Ready to Learn More?
Looking to bring ethical tech to your care organisation?
📅 Book a discovery call today!
Got thoughts about data privacy or ethical AI? We’d love to hear from you.
📩 Contact us, your feedback shapes our future.