A patient-centric approach places the needs, rights, and concerns of patients and their families at the center of healthcare decisions and processes. In pediatric oncology, adopting AI means that healthcare providers and technology developers must involve patients, parents, survivors, and advocacy groups in designing and deploying AI tools. These tools can support diagnosis, outcome prediction, treatment planning, and data management, but real patient experiences and ethical considerations must guide their use to avoid reducing patients to data points or creating privacy and consent problems.
The European project UNICA4EU shows how patient-centric methods work in practice. It was led by CCI Europe, the largest organization of childhood cancer parents and survivors in Europe. The project gathered responses from 332 people (patients, parents, and survivors) across nine languages. Through surveys and focus group discussions, it identified the main concerns and preferences about AI in childhood cancer care.
The issues identified mirror concerns seen in the U.S., which has a large and diverse population of children with cancer. U.S. healthcare leaders must understand and respond to these concerns to build trust with families, improve care, and meet legal and ethical requirements.
Patient-centered AI is just as important in the U.S. The healthcare system is technologically advanced but faces significant legal, cultural, and social challenges. Pediatric cancer care often involves many specialists, advanced diagnostics, and long-term follow-up, which means medical administrators and IT managers face numerous hurdles when introducing AI.
U.S. systems can learn from Europe by creating:
AI can improve pediatric oncology workflows without disrupting daily operations. It can handle front-office tasks and speed up communication between families and care teams. For example, Simbo AI answers calls, schedules appointments, and triages incoming requests, which reduces staff workload and helps families get prompt answers. Rapid information sharing is critical when treating children with cancer.
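For illustration, here is a minimal sketch of how an automated front-office workflow might triage an incoming message and route urgent cases to clinical staff. This is not Simbo AI's product or API; the keywords, class, and routing labels are hypothetical, and any real triage logic would need clinical validation.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical urgency keywords; a real triage model would be clinically
# validated and far more nuanced than simple keyword matching.
URGENT_KEYWORDS = {"fever", "bleeding", "vomiting", "severe pain"}

@dataclass
class IntakeRequest:
    caller: str          # parent or guardian
    patient_id: str      # internal (pseudonymous) patient identifier
    message: str         # transcribed reason for the call
    received_at: datetime

def triage(request: IntakeRequest) -> str:
    """Route a front-office request: urgent messages go to the on-call
    nurse line, everything else to routine scheduling."""
    text = request.message.lower()
    if any(keyword in text for keyword in URGENT_KEYWORDS):
        return "on_call_nurse"     # escalate immediately to clinical staff
    return "scheduling_queue"      # routine: appointment booking or callbacks

# Example: a parent reports fever during chemotherapy
req = IntakeRequest("Parent", "P-0132", "My child has a fever of 39C", datetime.now())
print(triage(req))  # -> "on_call_nurse"
```

Even in a sketch like this, the routing decision is always a hand-off to a human clinician, not a replacement for one.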
Other examples where AI helps include:
For administrators and IT managers, AI should be introduced carefully so that it supports, rather than replaces, human care. Staff should be trained on the tools and able to explain their use to families clearly.
A key lesson from the UNICA4EU research is that trust grows from clarity and openness. When families understand how AI works and see that their data and rights are protected, they are more willing to trust it. This is critical in pediatric oncology, where parents carry heavy worry and children may not be able to fully understand or give consent themselves.
Practice managers should create:
European groups such as CCI Europe show how patient advocacy organizations can connect medical research and families. In the U.S., similar pediatric cancer advocacy groups can help shape AI policies and education.
Including these groups helps ensure:
Adding AI to pediatric oncology care is not simple. U.S. healthcare faces issues like:
Solving these problems takes teamwork from healthcare leaders, IT teams, finance staff, and patient voices.
Caring for children with cancer is highly complex, requiring medical expertise and attention to psychological, social, and ethical needs. AI has the potential to improve diagnosis, personalize treatment, and streamline operations, but success depends on keeping patients and families at the center and respecting their concerns and rights.
European research such as UNICA4EU and organizations such as CCI Europe offer practical examples of how to include patients, parents, and survivors in AI decision-making. U.S. pediatric oncology services can apply these lessons to develop policies on informed consent, data ownership, privacy, and ongoing communication.
For administrators, practice owners, and IT managers in the U.S., implementing AI is not just a technical challenge; it is also an ethical and practical one. Following patient-centered principles helps maintain trust, meet legal requirements, and improve care for some of the most vulnerable patients.
UNICA4EU focuses on a patient-centric approach to integrate AI in childhood cancer care pathways, emphasizing evidence-based patient advocacy to build trust while safeguarding patients’ fundamental rights.
CCI Europe, the largest pan-European childhood cancer parents’ and survivors’ organization, led this task, representing 63 member organizations across 34 countries.
A survey was conducted, translated into nine European languages, gathering responses from 332 individuals, supplemented by focus group discussions with diverse participants including parents, survivors, and bereaved parents.
The areas of interest were data anonymization and protection, data ownership, data withdrawal, ethical concerns regarding data use, data types, and informed consent.
Patient advocacy ensures that trust is built by protecting patients’ rights, guiding ethical data governance structures, and emphasizing transparency in data sharing, access, and usage policies.
The study highlights the need for strong data anonymization and protection measures to safeguard the privacy of pediatric oncology patients whose data is used in AI processing.
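As a hedged illustration only (the study does not prescribe an implementation), the sketch below shows one way a provider's IT team might pseudonymize and de-identify records before they reach an AI pipeline. The field names and key handling are assumptions, and real de-identification must follow HIPAA or GDPR guidance and institutional review.

```python
import hmac
import hashlib

# Assumed direct identifiers; a real pipeline would follow HIPAA Safe Harbor /
# GDPR guidance and be reviewed by a data protection officer.
DIRECT_IDENTIFIERS = {"name", "address", "phone", "email", "date_of_birth"}

def pseudonymize_id(patient_id: str, secret_key: bytes) -> str:
    """Replace a patient ID with a keyed hash so records can still be
    linked for research without exposing the original identifier."""
    return hmac.new(secret_key, patient_id.encode(), hashlib.sha256).hexdigest()

def deidentify(record: dict, secret_key: bytes) -> dict:
    """Drop direct identifiers and pseudonymize the patient ID."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    cleaned["patient_id"] = pseudonymize_id(record["patient_id"], secret_key)
    return cleaned

record = {
    "patient_id": "P-0132",
    "name": "Jane Doe",
    "date_of_birth": "2016-05-01",
    "diagnosis": "ALL",
}
# The key must live in a secrets manager, never in source code.
print(deidentify(record, secret_key=b"example-key-from-vault"))
```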
Including these stakeholders (parents, survivors, and bereaved parents) ensured diverse perspectives on ethical concerns and data usage, reinforcing the importance of respect and sensitivity toward affected families in AI governance.
Stakeholders emphasized clear definitions of data ownership to empower patients and families, promoting control over their personal data and ensuring transparency in its use.
Informed consent is considered critical, requiring clear communication on data use, patient rights, and potential AI outcomes to maintain ethical standards and patient autonomy.
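To make informed consent, data ownership, and data withdrawal operational, IT teams typically need a machine-readable consent record. The sketch below is a minimal, hypothetical example (the class name, fields, and use labels are assumptions, not part of UNICA4EU's recommendations) showing how consent scopes and withdrawal could gate downstream AI data use.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class ConsentRecord:
    """Illustrative consent record: which uses of a child's data the
    guardian has agreed to, and whether consent has been withdrawn."""
    patient_id: str                                    # pseudonymous identifier
    guardian: str
    permitted_uses: set = field(default_factory=set)   # e.g. {"ai_model_training"}
    granted_at: datetime = field(default_factory=datetime.now)
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        """Record withdrawal; downstream systems must stop new processing."""
        self.withdrawn_at = datetime.now()

    def allows(self, use: str) -> bool:
        """A data use is allowed only if consented to and not withdrawn."""
        return self.withdrawn_at is None and use in self.permitted_uses

consent = ConsentRecord("P-0132", "Parent", {"clinical_care", "ai_model_training"})
print(consent.allows("ai_model_training"))  # True
consent.withdraw()
print(consent.allows("ai_model_training"))  # False after withdrawal
```

Keeping the consent check in one place makes it easier to prove to families and reviewers that a withdrawal actually stops new AI processing.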
Recommendations focus on transparent AI data governance, prioritizing patient rights, ethical data management, secure data sharing frameworks, and ongoing patient and parent engagement in decision-making.
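One concrete way to support transparency in data sharing, access, and usage is an auditable access log that families and governance boards can review. The snippet below is a minimal sketch under assumed field names and storage (a JSON-lines file); a production system would need tamper-evident storage, access controls, and retention policies.

```python
import json
from datetime import datetime, timezone

def log_data_access(log_path: str, patient_id: str, accessor: str,
                    purpose: str, fields: list) -> None:
    """Append one auditable entry per data access so governance boards
    can later review who used which data, and for what purpose."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "patient_id": patient_id,   # pseudonymous identifier
        "accessor": accessor,       # system or person accessing the data
        "purpose": purpose,         # e.g. "relapse-risk model inference"
        "fields": fields,
    }
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

log_data_access("access_log.jsonl", "P-0132", "risk_model_v2",
                "relapse-risk prediction", ["labs", "treatment_history"])
```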