It will be tested with volunteers who have sought help because they are drawn to illegal images and want to ensure they cannot act on their desire.
Installed on devices such as phones, the app will identify and block harmful images and videos from being displayed.
It's hoped it can help combat "growing demand" for child abuse images.
The Protech project is a collaboration involving organisations from the EU and UK.
The project's app - called Salus - is intended to work in real time, using artificial intelligence to identify potential child sexual abuse material and stop users from seeing it. It will also use other, more conventional techniques to block content.
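The article does not describe how Salus works internally, but screening tools of this kind typically layer a conventional check against a list of known material on top of an AI classifier. The sketch below is purely illustrative: the names `known_hashes`, `classifier_score` and `BLOCK_THRESHOLD` are hypothetical stand-ins, not details of the project's actual design.

```python
import hashlib
from typing import Callable

# Illustrative only: real systems typically use perceptual hashes and
# trained image classifiers; both components here are hypothetical.

BLOCK_THRESHOLD = 0.8  # assumed score above which content is blocked


def exact_hash(image_bytes: bytes) -> str:
    """Fingerprint of the file (a stand-in for a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()


def should_block(image_bytes: bytes,
                 known_hashes: set[str],
                 classifier_score: Callable[[bytes], float]) -> bool:
    """Decide, before display, whether an image should be blocked.

    1. Conventional check: is the file on a list of known material?
    2. AI check: does a classifier rate it above the blocking threshold?
    """
    if exact_hash(image_bytes) in known_hashes:
        return True
    return classifier_score(image_bytes) >= BLOCK_THRESHOLD
```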
The Internet Watch Foundation, an organisation that works to find, flag and remove child abuse material, will help to train the AI technology developed by the UK company SafeToNet.
Tom Farrell of SafeToNet, who worked for 19 years in law enforcement, told the BBC the app was not intended to be a tool to report users to the police: "People who are voluntarily looking to stop themselves seeing child sexual abuse material quite clearly wouldn't use such a solution if they believe that it was going to report them to law enforcement."
Volunteers who download the app will be recruited via organisations working with individuals seeking help because they are drawn to online child abuse images.
One such organisation is British charity the Lucy Faithfull Foundation, which operates a helpline for those who fear they may download illegal images and wish to stop. That includes a significant number of people who admit to being paedophiles, some of whom have already been convicted.
The foundation's Donald Findlater said tools such as the new app could help individuals control their behaviour, adding: "It is a practical aid to people who recognise a vulnerability in themselves."
Members of the Protech project hope it could stem the "growing demand for child sexual abuse material online".
A record 30,925 offences involving the possession and sharing of indecent images of children were committed in the year 2021/22, according to the NSPCC.
Last year a report by the Police Foundation thinktank said that the volume of online child sexual abuse offences had "simply overwhelmed the ability of law enforcement agencies, internationally, to respond".
Project members who spoke to the BBC suggested that policing alone was not going to stop people downloading images.
Mr Farrell argues that the UK has arrested more individuals for possession of child sexual abuse material than any other country in the world since 2014 and in the process has identified some very serious offenders.
But millions of people still view such images.
"So arrest isn't going to be the solution. We think we can work on the prevention side and reduce the demand and reduce the accessibility."
Many details of how the app will operate still need to be worked out. No AI is perfect, and a balance will need to be struck between over-blocking - which would make legitimate use of a device difficult - and under-blocking - which would fail to detect abuse images.
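That balance is the familiar trade-off of tuning a classifier's decision threshold: lowering it blocks more legitimate content, raising it lets more abuse material through. The snippet below is a generic illustration with invented scores and labels, not an evaluation of Salus.

```python
# Generic illustration of the over-/under-blocking trade-off.
# Scores and labels are invented; 1 = abuse material, 0 = legitimate.
scores = [0.05, 0.20, 0.35, 0.55, 0.60, 0.75, 0.85, 0.95]
labels = [0,    0,    0,    1,    0,    1,    1,    1]

for threshold in (0.3, 0.5, 0.7):
    blocked = [s >= threshold for s in scores]
    over = sum(b and l == 0 for b, l in zip(blocked, labels))       # legitimate content blocked
    under = sum(not b and l == 1 for b, l in zip(blocked, labels))  # abuse images missed
    print(f"threshold={threshold}: over-blocked={over}, under-blocked={under}")
```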
Mr Farrell says the app will be tested in a "pilot stage" in five countries - Germany, the Netherlands, Belgium, the Republic of Ireland and the UK - with at least 180 users over an 11-month period.
And experts not involved in the project think the idea has promise.
Professor Belinda Winder of Nottingham Trent University said it was a welcome development that could support people who "want to be helped to resist their unhealthy urges, and who would benefit from this safety net".
As with all new tech tools, the devil would be in the detail. Prof Winder had questions about how the app would work in practice but said: "It is a positive step in the right direction."