
Keep cybercriminals out of your hospital

It is beyond dispute that healthcare is one of the sectors investing heavily in technological development and innovation. Think of smart wristbands, mobile services, indoor wayfinding and more. But innovative care also puts demands on your network. The sheer volume of patient data stored within the healthcare sector, for example, makes it an important and highly profitable target for criminals, who are only too eager to get hold of that data to commit identity fraud. To provide good care and work efficiently, patient records and other data must be secure at all times. That is why this blog discusses how best to protect your network against cybercrime: both in terms of technology and architecture, care providers need a solid network solution that offers sufficient visibility.

More users, more devices, more data and more connectivity also mean more risk. DDoS attacks, intrusion attacks and malware pose a real threat, and the impact of unwanted connections should not be underestimated either. What happens, for example, when someone plugs a cable into a printer with a network port? Today your network has no view of such connections, so someone can simply gain access to it. In addition, many hospital employees in administrative roles now work from home. From their own living room or home office they need to exchange and collaborate on files securely and access all kinds of databases and platforms. As a result, your network keeps getting more complex and therefore harder to monitor and manage.


Who has access to your network?

With the growth in remote workers, devices and data, the need for advanced network security and monitoring keeps increasing. Access control lets you secure your network and reduce the management burden. Aruba ClearPass is a network access control tool that integrates easily with your existing infrastructure. You can build context-aware, role-based policies that automatically manage access for different user groups. Users are recognised quickly, and you decide which environments they may reach, without losing visibility or control. Remote workers, for example, get access to all the platforms and databases they need, while guest users can only browse on your wireless network. Aruba ClearPass also blocks access when someone simply plugs a cable into a printer with a network port. That way no unauthorised device can make off with your sensitive data or patient records, and you know exactly who or what has access to your network environment.
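To make the idea of role-based access control concrete, here is a minimal conceptual sketch in Python. It is not the Aruba ClearPass API; the role names, device attributes and enforcement values are illustrative assumptions only.

# Conceptual sketch of role-based network access control (not the ClearPass API).
# Role names, attributes and enforcement values are illustrative assumptions.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Endpoint:
    identity: str           # authenticated user or device identity, if any
    role: Optional[str]     # e.g. "remote-worker", "guest", or None for unknown
    device_profiled: bool   # did device profiling recognise what this is?

# Role-based policy: which segment each role lands in and what it may reach.
POLICY = {
    "remote-worker": {"vlan": "staff", "access": ["emr", "fileshare", "intranet"]},
    "guest":         {"vlan": "guest", "access": ["internet-only"]},
    "medical-iot":   {"vlan": "iot",   "access": ["telemetry-broker"]},
}

def authorize(endpoint: Endpoint) -> dict:
    """Return an enforcement decision for a device joining the network."""
    # Unknown or unprofiled endpoints (say, a cable plugged into a printer's
    # network port) are denied instead of landing on the flat network.
    if endpoint.role is None or not endpoint.device_profiled:
        return {"action": "deny", "reason": "unknown endpoint"}
    profile = POLICY.get(endpoint.role)
    if profile is None:
        return {"action": "deny", "reason": "no policy for role"}
    return {"action": "permit", **profile}

print(authorize(Endpoint("j.doe", "remote-worker", True)))    # permit, staff VLAN
print(authorize(Endpoint("unknown-printer", None, False)))    # deny

In a real deployment the enforcement result would translate into a VLAN assignment or firewall role pushed to the switch or access point, but the decision logic follows the same pattern: recognise the endpoint, map it to a role, and deny anything unknown by default.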


Turn data into valuable actions

IoT applications are also being used widely in healthcare today, and rightly so. Mobile apps and devices make it possible to monitor patients remotely, tracking steps taken, calories burned and heart rate. Think also of care robots that activate and motivate, or wander detection for people with dementia. In other words, IoT applications move healthcare forward because they enable a better experience for users and patients.

But there is another side to the coin. It is not just access to your network that needs attention: the large volume of unstructured data generated by IoT devices must also be managed to keep your network environment secure. Aruba ESP (Edge Services Platform), a cloud-native platform, automates, unifies and protects. All the data your IoT devices and applications generate today lives at the edge of the network, which is where the greatest risk now sits. To use that data efficiently and safely, it has to be analysed and processed. That is possible with a network that uses AI, a sixth sense of sorts: your network, its users and its devices are continuously analysed and the results are turned into knowledge. Downtime is avoided because problems are predicted and resolved before they occur.

All of this comes together in Aruba ESP: a single cloud-native platform that secures your organisation's entire infrastructure and can be delivered as a service. Securitas offers you a flexible, managed-service environment, so you don't have to worry about costs or management and your care staff can get on with their work efficiently and with peace of mind!


Make your network a strong foundation for your care.

A powerful network integrates your care applications, prepares you to innovate and ensures that care providers can collaborate more efficiently. Do you want to strengthen your network and protect it against cybercrime? Or do you have questions about your network environment or upcoming projects?

Don't hesitate to contact us. Securitas is happy to help you build the network of the future.

24-11-2020 - Aruba


INFOGRAPHIC: Smart Cities are no longer a buzzword, but a reality

People work and live differently today, and government is keeping pace: most cities and municipalities are now taking their first steps towards becoming a Smart City. These innovative projects open up countless possibilities and genuinely push boundaries.

Digital transformation can strengthen and improve your city or municipality's telecom infrastructure; think of public Wi-Fi or local clouds for government services. The city's operations can also get a boost thanks to intelligent parking, smart street lighting and more efficient waste collection. But Smart City projects go further than that, covering areas such as the protection of personal data, e-services and citizen-centric policy. The future has never been this close!


Download the Smart Cities infographic below:

24-06-2020 - HPE


Data has never been this mobile.
How do you, as a municipality, deal with an edge-to-cloud world?

Innovative local governments are investing in smart IoT and Smart City projects today. Everything, from street lighting to waste collection and parking policy, is getting smarter. Sensors throughout the city make it possible to measure practically everything, and with that data we can do interesting things.

But that also brings some major challenges. Data is now everywhere in the city, and it is getting harder to manage. Not only because there is more of it, but also because much of that data will never see a datacenter: increasingly, data is processed where it is created and used, at the edge.

The step towards the edge.

Local governments used to run a robust datacenter where all data flowed in and was processed. Today, more and more local authorities rely on cloud services and edge compute. Smart traffic solutions in port areas or industrial zones, for example, now run on data that is generated and processed on the spot. The result: lower latency, simpler management and robust systems with a clear purpose.

Smart City projects stand or fall with how efficiently data is turned into insights and concrete measures. That is exactly why it pays to work outcome-driven at a very local level while retaining control from the central datacenter.

Monetising data?
Data is worth its weight in gold, not only to local governments but also to other players. Logistics companies, public transport operators and smart app developers build solutions on data as well. But to set up partnerships, data availability has to be guaranteed and security has to be assured.


A robust network infrastructure.

Connectivity also plays a major role here. Data has never been this mobile. With ever more connected devices, users and administrators, the need for bandwidth, speed and security grows as well.

Data must be available, but that does not mean it should be openly accessible. A strong access policy is an essential link in every Smart City project. Role-based access policies, device health checks and multi-factor authentication safeguard the security of your data.
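As an illustration of how those three controls can work together, here is a small, hedged sketch in Python. The role names, the patch-age threshold and the scope labels are assumptions made for the example, not a reference to any specific product API.

# Sketch of an access decision combining a role-based policy, a device health
# check and multi-factor authentication. Roles, thresholds and scopes are
# assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_role: str         # e.g. "traffic-analyst", "contractor"
    mfa_passed: bool       # was the second factor verified?
    patch_age_days: int    # age of the device's last security update
    disk_encrypted: bool

ROLE_SCOPES = {
    "traffic-analyst": ["traffic-data", "dashboards"],
    "contractor": ["ticketing"],
}

MAX_PATCH_AGE_DAYS = 30    # assumed device health threshold

def device_healthy(req: AccessRequest) -> bool:
    return req.disk_encrypted and req.patch_age_days <= MAX_PATCH_AGE_DAYS

def grant_scopes(req: AccessRequest) -> list:
    """Grant data scopes only when role, MFA and device health all check out."""
    if not req.mfa_passed or not device_healthy(req):
        return []          # unverified user or risky device: no access
    return ROLE_SCOPES.get(req.user_role, [])

print(grant_scopes(AccessRequest("traffic-analyst", True, 12, True)))   # both scopes
print(grant_scopes(AccessRequest("traffic-analyst", True, 90, True)))   # [] - device out of date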


Where to begin?

As a local government, it is not easy to see the forest for the trees. IoT and Smart City projects touch many domains, and extensive expertise is needed to make them a success. Most important of all, however, is a solid foundation: your data and network infrastructure form the cornerstone of innovation.


Time for innovation in your city or municipality?

Download the presentation by Stephane Lahaye on EDGE TO CLOUD now:

DOWNLOAD

Want to know even more about Smart City projects?

DOWNLOAD WHITEPAPER

How can Securitas help you take the next step?

Don't hesitate to contact us with your questions, challenges or projects.

08-05-2020 - HPE


How to build a smart city that will last

One-off smart city projects are rarely designed for the long haul. To build a city that will grow and support its citizens for decades to come, you must identify what really matters and create an architectural framework that supports that vision.

Imagine this scene, 20 years from now: You wake up, open your window, and look out over your city. I can guess that it’s a fantastic place. Note that I said “fantastic,” not “smart.”

From my own future-vision window of 2038, I see Rome. Its bi-millennial history is reflected in every stone. But this future Rome is also a city that works fluidly around me. I live a modern life while being immersed in Rome’s awesome grandeur. I have easy access to services and information. I do not distinguish between what is smart and what isn’t, because it just doesn’t matter. What does matter is that the city and its services provide what I need, when and where I need it.

I expect the projects we think of as "smart" today to turn into something else. Instead of calling out a collection of "smart" features, the city itself will evolve seamlessly, a fluid experience continuously adjusting around me and everyone else. These features will enable cities to better accomplish their mission: to provide infrastructure and to enable business and personal relationships. They should be unnoticeable, an obvious part of what you see around you, the way we take electricity for granted today.

What smart really means

Current and future smart projects cannot be isolated from their surrounding environments. Smart parking, for example, isn't just a matter of counting inbound and outbound cars and helping you find a free spot. Real smart parking is aware that you are entering or exiting. It interacts with an intelligent traffic system to reduce queues. It informs the restaurant that you are arriving to claim your booked table. It drives your autonomous car to the closest free spot. It charges your credit card and sends the information to your car's system for your convenience. It also provides real-time information to improve operations and security in the area.

Moreover, every smart project is unique, depending on the uniqueness of each and every city. The local aspirations, needs, and priorities combine with specific legacy infrastructure and services, organizations, and regulations, defining a unique meaning for a smart project for a specific city and community. Nowadays, a smart city can be a concept (such as Singapore, based on the clear goal of sustainability and livability) or an ambition (like Dubai's aim to be the happiest city in the world). It can also be a way to solve specific problems, or an improvement in the efficiency and experience of public services.

There are plenty of opportunities, and more will be enabled by emerging technologies such as 5G, blockchain, and artificial intelligence. Around these technologies is a vibrant, multidimensional ecosystem embracing traditional players, technology providers, and innovators, with potentially infinite combinations to solve each unique city problem.

Technologies are the core enabler for such transformations, but they are not enough to make something “smart.” It is not difficult to gain additional insights about a service by gathering more digital information from a sea of sensors, or to open up relationships with citizens by providing digital access to the public services. Rather, the "smart" label should be assigned only when technology enables a foundational change, such as dramatically reducing water or power losses, transforming an insecure area into a public garden for families, or improving accessibility to people with disabilities. With a proper vision, a city can become more efficient and sustainable, and citizens' daily experiences will improve.

Are you ready for the IoT? Here's a framework that will get you started.

Download the IDC whitepaper

Why smart projects fail or fade away

We all can think of examples of technology uses that once impressed us but turned into dust. In the same way, how many smart city projects are likely to survive until 2038? In city terms, I've watched a number of projects disappear, including traffic monitoring systems that collected data that didn’t make a difference, a CCTV security network that didn’t increase a venue’s security, and a public Wi-Fi system that was eventually abandoned. (I’d love to hear about your own experiences.)

Designing a project to address a point problem isn't wrong in itself, but implementation usually takes one or two years, or even more. Then the solution lasts a year or two before the technology becomes obsolete or a city's evolving needs turn investments elsewhere. And that is with regard to technology features only. Expect at least four generations of technology between now and that 20-year point, and 10 or more if Moore's law still applies. Other factors working against urban planners are changes in city population, economics, and demography. Regulations also evolve as a result.

A city’s infrastructure, services, topology, and population, on the other hand, can last and evolve over a decade or more, which means the innovation we introduce should last as long. 

Think in terms of city services and how they can change as the city does. For example, consider what a smart garden might look like. You can scatter Wi-Fi access points around and implement a good mobile app today. Or you can envision a step-by-step evolution over several years that transforms the garden into an experience you can live in, one that is more immersive, personal, and relevant.

Take that a step further and consider something more personal: your house. A house has many similarities to a city. It is designed to reflect you and the experience you want to live. It’s sustainable, but it continuously evolves. It’s intimate, and reflects your dreams, aspirations, and needs. A house is alive. Pieces of furniture come and go, parts are broken and replaced, families grow and shrink. You change, and you expect your house to change with you. However, if you don’t pay attention to how your house is evolving, you will face a mess and be disappointed. 

Now, scale that to the city level and project it over the next 20 years.

You can take steps to ensure that your house matches your vision. Engage an architect who can translate your ideas into a project, considering all of the components and constraints and looking at the outcome, even going beyond your initial ideas (or do it yourself). You might also weigh the advantages of doing minor updates in the short term versus a complete renovation that might have a better outcome. Why not do that at the city level?

House and city modernization alike must also take specific constraints into account, including history, city planning, and architecture. You may need to consider elements of industrial archaeology—for example, in Rome, your decisions may be forced by the presence of the Coliseum. City designers have to work around that. Or you may have the full freedom to change virtually everything, as in the living SimCity of Dubai. A multidisciplinary approach led by the city's administration has to navigate those kinds of issues.

Smart city success factors

There are three key success factors that recur in successful smart city projects.

First, create your specific strategy. No city in the world is the same, so despite having similar definitions (smart traffic, smart lighting, smart water, etc.), no smart project or service will be identical. When you scratch the surface, you see different goals, constraints, processes, and technologies. So define the frame; it will likely stay consistent for a decade or two, as changes are limited by a city's momentum.

Second, you need the right components. Shortcuts, including fancy or cheap solutions, do not work in the mid term. However, open standards do work, as any city infrastructure has demonstrated in recent decades. Technology matters, so be smart and protect your investments by leveraging the evolution of standards rather than potentially ephemeral proprietary capabilities.

Third, what makes a city smart, or fantastic, is the perceived outcome. A customer of mine summarized this concept: "A city is smart if it helps you." Component or technology changes alone do not make a city smarter. And no one can deliver a full city outcome working alone.

What made your 2038 view so fantastic?

Go back to your window to the future and look at your fantastic city in 2038 again: You can now capture what made it so great. You recognize a clear evolution of patterns that systematically changed all parties toward a common design. The local community flourished by leveraging the development opportunities provided by the modernized infrastructures and services. Pervasive efficiency is evident. You live in a fantastic place, not just a smart one.

Smart cities: Lessons for leaders

  •    Identify what really matters for your city in the next 2, 5, 10, 20, 30 years.
  •    Understand the unique frame of your city and what its future should be.
  •    Capture short-term wins that are aligned with and functional to mid- and long-term evolution.
  •    Engage the right ecosystem of partners to solve a city problem or enhance a city service.
  •    Strategically adopt open standard technologies coherently, with a clear architectural frame.

Lorenzo Gonzales - Strategist, Global Technology and Presales, Hewlett Packard Enterprise

26-06-2019 - HPE


5 reasons why data backup strategies fail

On-prem data backup solutions can prove costly, and public cloud solutions may not meet regulatory requirements. Learn how establishing an in-house, consumption-based data backup model can give you the best of both worlds.

Backup isn’t the most glamorous subject in the IT universe, but it’s critical to your business. Lose your data and your business could be lost, too.

Given the importance of the task, running backups on premises has been the go-to approach for most enterprises. The benefits of keeping data behind a firewall are clear from a security and privacy perspective, but there’s a downside too: You’re saddled with large upfront costs for equipment, you need trained staff to run your backups, and it’s hard to scale up to meet new demands quickly. Capacity planning is challenging, too, because you must navigate between the risks of costly overprovisioning on the one hand and running out of storage on the other.

Instead, you might decide to mitigate some of these problems by using the cloud, but the loss of control means there are privacy and security issues. And an unforeseen network problem could stop you from accessing your backup data.

Thankfully, there’s a “best of both worlds” option that offers all the advantages of the cloud combined with an on-premises approach that secures and protects sensitive data and helps you meet various regional data compliance regulations. This is made possible by a consumption-based model for IT, where a vendor offers enterprises a billing, monitoring, and consumption model across their hybrid IT platforms. One service offered in this manner is backup: A vendor can provide an end-to-end service—including the necessary equipment, applications, processes, and management—and deploy it on the client’s premises.

The benefits to your business are significant, including the ability to work with a provider on a payment model that focuses your resources on the most critical aspects of your backup plan. You'll also free up your staff for more creative, revenue-positive work instead of the drudgery of backup chores, enjoy a simpler approach to costs and billing, and have the comfort of knowing that your provider is closely monitoring your data and keeping you on track as you grow and adopt new workloads. It's also worth noting the value of a solid backup plan: the savings from reducing productive time lost to outages can amount to as much as $1.52 million per organization per year.

A blueprint for achieving consumption-based IT with on-premises infrastructure. Learn more

Here are five reasons why traditional backup strategies fail:

1. Routine tasks take up too much time 

Backup is as demanding as it is important. It involves planning, management, monitoring, and troubleshooting. This is a headache you probably don’t need, given that most IT departments are already overstretched. According to IDC, IT staff spends too much time handling routine tasks such as installing and deploying hardware, monitoring systems, and patching and updating software. Backup, too, is a task that demands a lot of time. In fact, over a given week, IT admin and operations staff spend just 14.5 percent of their time on innovation and new projects that can drive the business forward.

This is why more enterprises are turning to a backup solution based on a consumption-based model instead of trying to handle it in-house. It takes care of the mundane IT tasks on your plate, such as backup, and frees up your IT staff to take on the projects that can really make a difference to your core business. And it can be cost-effective, too. According to IDC’s analysis, the time savings and productivity gains of this approach will be worth $29,037 per 100 users per year.  

The IT department of the past was reactive to the business, providing support services such as patches and updates, or adding new functionality to the company’s operations. Now, at a time when the IT infrastructure is becoming increasingly vital to support the enterprise’s growth, IT organizations must become innovative and proactive, staying ahead of the company’s business needs. That means providing services outside traditional support roles. 

2. Control is lost 

If you’re backing up your data mostly in the public cloud environment and not on premises, your data is more vulnerable. A hospital, for example, wouldn’t back up patient information to a public cloud provider, and often companies want to give extra protection to sensitive information such as customer data or intellectual property. A network issue could suddenly block off access to your backup data, leaving you in limbo for hours. With the public cloud, you can’t completely put to rest nagging worries about data privacy, security, and protection.

The benefit of an on-premises IT consumption model is that you can keep your apps and data secure in your own environment while offloading the heavy lifting to someone else. And you only pay for the data that you back up, eliminating upfront hardware costs. With IT teams facing pressure to minimize risk and cost to the enterprise while delivering IT services quickly, this approach to backup makes sense. A more secure environment mitigates risk, which means fewer disruptions.

3. Too much is spent on Capex

It’s a common scenario for IT companies: A new IT project, such as setting up a backup system, means large investments in hardware and software and spending months getting the new system up and running. In the end, you may have a lot of expensive, shiny new hardware sitting there waiting for applications. It’s a costly capital expenditure that represents an old way of looking at IT.

There's a better way to do this. A consumption-based approach allows business unit leaders to manage their IT costs, align them to revenue, and enjoy more flexibility. Under this scenario, the business works with a vendor to set up its backup capabilities on premises and is billed for what it consumes each month. The outside IT provider monitors backups to ensure that more capacity is on hand when needed. The company agrees to a minimum level of use but can add more or less backup capacity as its IT needs change. This end-to-end backup solution will typically include back-end storage and management servers, plus the software and operating systems for the backup solution.
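The article doesn't spell out a billing formula, but one common pattern for this kind of contract is to bill the greater of the committed minimum and the metered usage. The short Python sketch below illustrates that assumed pattern; the capacity figure and price are made up for the example.

# One possible shape of the monthly settlement: the customer commits to a
# minimum, is metered on actual backup consumption, and pays for whichever is
# higher. The formula, capacity and price are illustrative assumptions.

COMMITTED_TB = 30.0     # agreed minimum capacity (assumption)
PRICE_PER_TB = 25.0     # assumed unit price per TB per month

def monthly_invoice(metered_tb: float) -> float:
    """Bill the greater of the committed minimum and the metered usage."""
    billable_tb = max(COMMITTED_TB, metered_tb)
    return billable_tb * PRICE_PER_TB

print(monthly_invoice(22.0))   # below the minimum: the committed 30 TB is billed
print(monthly_invoice(47.0))   # above the minimum: the metered 47 TB is billed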

Another benefit is that the IT provider ensures the equipment you have on premises meets your backup needs. Research from IDC has found that some 45 percent of the IT equipment used by a typical large enterprise is more than 5 years old, meaning businesses are paying to manage older, less efficient IT, while modern systems running on newer, more energy-efficient technology have far better management capabilities and security. The cost and risk of retaining older technology outweigh any benefit of holding on to it, and your IT provider will ensure you have the technology you need.

There is also a benefit to IT departments that often feel the pressure to purchase the very latest hardware. A backup-as-a-service provider will install hardware on premises for you and, as long as it meets your requirements for backup, will ensure that you’re not paying for unnecessary upgrades. After all, if most of your road trips are to and from the local grocery store, why purchase a Lexus when a Toyota does the job perfectly well?

4. Capacity planning is a guessing game

Backup capacity planning is a thorny issue for many enterprises. Many now simply estimate how much they will need for the next few years and then purchase the amount of IT equipment that will cover that demand. An added complication is that you need to monitor your environment to stay ahead of the organization's expansion needs or anticipate sudden demand peaks that could overwhelm your current capacity. No wonder few enterprises successfully match capacity and demand, tending instead to take on more storage than they need.

The good news is that under a consumption IT approach, you don’t need to worry about running out of capacity. Generally, an IT provider delivering backup as a service will include a local buffer of capacity that you can dip into as needed. And the provider will constantly monitor your backups to ensure you have enough capacity.

For example, say you back up once each weekday but later decide to also back up on the weekend. A provider that is monitoring your environment can detect that change and proactively add more capacity. And predictive analytics can forecast how much you may need in the months and years ahead—no disruptions to your business and no lengthy provisioning cycles. This is likely music to the ears of IT managers, given that a lack of capacity can have a harmful effect on a company. According to 451 Research, 50 percent of enterprises have suffered downtime due to poor capacity planning.
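As a rough illustration of that kind of monitoring, the sketch below flags a change in backup cadence (weekend jobs appearing) and projects capacity needs forward. The detection rule and the assumed 3 percent monthly growth are illustrative assumptions, not any vendor's actual analytics.

# Illustrative monitoring sketch: detect that weekend backups have appeared and
# extrapolate capacity needs. Sample data and growth rate are assumptions.

from datetime import date

# (date, size in TB) of recent backup jobs - sample data
jobs = [
    (date(2018, 3, 5), 10.0), (date(2018, 3, 6), 10.1),
    (date(2018, 3, 10), 9.8),                    # a Saturday: a new weekend backup
    (date(2018, 3, 12), 10.2), (date(2018, 3, 16), 9.9),
]

weekend_jobs = [d for d, _ in jobs if d.weekday() >= 5]   # Saturday=5, Sunday=6
if weekend_jobs:
    print("Backup cadence changed: weekend jobs detected, review buffer capacity.")

# Naive forecast of future consumption, assuming 3% growth per month.
current_tb, monthly_growth = sum(size for _, size in jobs), 0.03
for month in range(1, 4):
    print(f"Month +{month}: ~{current_tb * (1 + monthly_growth) ** month:.1f} TB expected")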

5. Billing is too complicated

The calculations and estimates that go into tracking cloud usage—monitoring and metering how much data you’re sending over and how often—are often complicated and time-consuming.

When an IT provider takes on this challenge for you, from a billing standpoint, it couldn’t be simpler. For example, if a customer has three servers, a provider could take the biggest single backup for each one during the billing period and aggregate them. So if the first server’s largest backup was 10 terabytes of data, the second host 15 TB, and the third 22 TB, the final tally of billed data would be 47 TB.
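Expressed as a quick calculation, the aggregation described above looks like this; the per-server figures mirror the 10 + 15 + 22 TB example in the text, and the server names are for illustration only.

# For each server, take its largest single backup in the billing period, then
# sum across servers. Figures mirror the example above; names are illustrative.

backups_tb = {
    "server-1": [8.2, 10.0, 9.1],    # backup sizes (TB) during the billing period
    "server-2": [15.0, 14.3],
    "server-3": [21.5, 22.0],
}

def billed_capacity(backups: dict) -> float:
    """Sum the largest backup per server over the billing period."""
    return sum(max(sizes) for sizes in backups.values())

print(billed_capacity(backups_tb))   # 47.0 TB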

This is a simplified billing structure that frees you up from the fear of overpaying as a result of the overprovisioning that often occurs with a traditional on-premises backup solution.

A provider can even handle individual backups by different business units. In the past, two separate departments in an organization might have wanted to keep their backups separate, with each ending up with double the capacity it needed. With a consumption-based model, they are more apt to share, minimizing costs, because at the end of the month they are paying only for what they use.

Once an organization decides on the amount of backup it needs, it needn’t fret about overcapacity or not having enough. Like your water bill, you pay it, but you don’t need to worry about it. An automotive company, for example, can focus on building great cars instead of worrying about paying for storage devices.

Adopting a pay-as-you-go approach allows enterprises to manage their costs and align them to revenue, while still enjoying incredible levels of flexibility. A business gets billed monthly and pays only for what it consumes. An outside IT provider monitors the company’s usage and adds more capacity as it’s needed.

The approach is very attractive to most organizations, and it’s especially attractive to your CFO, because the model can often be treated as an ongoing operating expense rather than a large upfront capital investment.

If you recognize any of these five limiting approaches to your backup, it may be time to consider a consumption-based strategy. Working with a trusted IT provider to establish a consumption-based approach to IT will give you a great on-premises solution and allow you to avoid investing heavily in an in-house IT infrastructure, while still enjoying the advantages of strong data security and recovery.

11-04-2018 - HPE
