Getting the below also when uploading some data. Was there a fix for this? System.Object[]
Starting to see the below message as it is uploading to Power BI. Any idea what is causing this?

Invoke-RestMethod : {"error":{"message":"The request was blocked by KeyBlocker"}}
At D:\My Projects\DmarcReports\Create-DMARCreport.ps1:1008 char:5
+ Invoke-RestMethod -Method Post -Uri "$endpoint" -Body (ConvertTo- ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (System.Net.HttpWebRequest:HttpWebRequest) [Invoke-RestMethod], WebException
+ FullyQualifiedErrorId : WebCmdletWebResponseException,Microsoft.PowerShell.Commands.InvokeRestMethodCommand

Thanks in advance.
Hi Martijn, Great script! After one week of working perfectly, I started getting an error message for a file format not being supported. When I logged on to the mail account collecting the reports, I saw all the Gmail reports still in the inbox. Did Google change something? Regards, Kaloyan
Thanks for all your effort Martijn, really really helpful! I am running into three errors when running the script that I cannot get rid of. The first:

At C:\dmarcworkfolder\Create-DMARCreport.ps1:1700 char:27
+ $xml += [xml](Get-Content -Path $item.fullname)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (System.String[]:String[]) [Get-Content], Exception
+ FullyQualifiedErrorId : ItemNotFound,Microsoft.PowerShell.Commands.GetContentCommand

Attempted to divide by zero.

The [Content_Types].xml is present at the mentioned location though, and filled with XML data. Do you know where the problem lies?
Hi Jeroen, Would you be willing to send over your data folder so I can take a look at the data? contact at tech-savvy.nl
Hi Martijn, thank you for the script! I am receiving a connection test failure when running the script:

System.Management.Automation.MethodInvocationException: Exception calling "AutodiscoverUrl" with "2" argument(s): "The Autodiscover service couldn't be located." ---> Microsoft.Exchange.WebServices.Data.AutodiscoverLocalException: The Autodiscover service couldn't be located.

We are in O365. Should the autodiscover be bypassed?
The script runs great. Had to enable the shared mailbox account. Thanks
If you don't have autodiscover properly implemented, you can use the bypass switch for O365. Also, the account you use needs either full permissions on the mailbox, or you need to enable the shared mailbox in O365 so it becomes a normal user. (Note that you have to assign a license to the mailbox if you do this.)
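For the full-permissions route, a rough sketch of what granting access could look like in Exchange Online PowerShell (the mailbox and account names below are placeholders, not taken from this thread, and this needs a connected Exchange Online session first):

```powershell
# Requires an Exchange Online PowerShell session, e.g.:
#   Connect-ExchangeOnline   (from the ExchangeOnlineManagement module)

# Give the service account full access to the shared DMARC mailbox
# so the script can open it via EWS without licensing the mailbox itself:
Add-MailboxPermission -Identity "dmarc-reports@contoso.com" `
    -User "svc-dmarc@contoso.com" `
    -AccessRights FullAccess -InheritanceType All
```

After this, the script can authenticate as the service account while targeting the shared mailbox.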
Thanks again Martijn! The reports are running great. Another stumbling block for me is the Power BI switch. I am using the switch and running the script as is. Everything processes and the reports are generated, but the data never uploads to Power BI. I did create the dataset per your instructions and added the push URL to the script. Am I missing a step somewhere?
Do you get an error, and did you create the Power BI dataset with the proper settings? Note the number types for longitude and latitude and the history slider. If you don't see any errors, the data should be in Power BI. Do note that data is only sent once for each item, so if you are testing with the same reports every time, remove the XML and CSV files from the working folders, as the script uses those to make sure data is not sent twice.
I tried this again after removing the XML and CSV files from the working folders. I checked the settings in Power BI and they are set correctly with the number types for latitude and longitude. I am receiving the following error at the end of the script run:

Compare-Object : Cannot bind argument to parameter 'ReferenceObject' because it is null.
At C:\Create-DMARCreport.ps1:2008 char:51
+ ... $newrecords = Compare-Object -ReferenceObject $sourceobject -Differ ...
+ ~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [Compare-Object], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.CompareObjectCommand
Thanks @Martijn for the script. I'm getting the same error as bebalint. How did you solve this, @bebalint?
Are you maybe testing with the same data? While supporting another user I found they kept testing with the same data while Power BI upload was enabled. While Power BI upload is enabled, the script deduplicates the data to prevent duplicate uploads. I am still planning to release a separate Power BI-only module, but I need some time for that. If you want me to have a look at your case, ZIP the working folder and mail it to contact@tech-savvy.nl and I will see if I can check it tomorrow.
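As an illustration of that deduplication idea (this is not the script's actual code; the variable name $sourceobject is borrowed from the error output quoted in this thread, everything else is made up), guarding Compare-Object against an empty reference set avoids the null-binding error while still only uploading unseen records:

```powershell
# Records already uploaded (in the real script these come from the CSV work files)
$sourceobject = @(
    [pscustomobject]@{ ip = '1.2.3.4'; dkim = 'pass' }
    [pscustomobject]@{ ip = '5.6.7.8'; dkim = 'fail' }
)
# Records parsed from the latest reports
$targetobject = @(
    [pscustomobject]@{ ip = '1.2.3.4'; dkim = 'pass' }
    [pscustomobject]@{ ip = '9.9.9.9'; dkim = 'pass' }
)

if ($null -eq $sourceobject -or @($sourceobject).Count -eq 0) {
    # Nothing uploaded yet: every record is new
    $newrecords = $targetobject
}
else {
    # Keep only records present in the new data but not in the reference set
    $newrecords = Compare-Object -ReferenceObject $sourceobject `
        -DifferenceObject $targetobject -Property ip |
        Where-Object { $_.SideIndicator -eq '=>' }
}
```

Here $newrecords ends up containing only the 9.9.9.9 entry, so re-running with identical input uploads nothing.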
Hello! First, thank you for the amazing script/tool for DMARC analysis, very powerful! Occasionally I see "System.Object[]" as a result type under the DKIM checks, where you would normally see pass/fail. Main report:

DKIM result       number of records
pass              189
System.Object[]   2

Then the same variable broken out into the top 15 sources for that result type shows the two IPs: "Top 15 sources for object System.Object[]". So at least the "System.Object[]" count is consistent with the results! Looking at the raw XML reports, the IPs listed as "System.Object[]" should be "pass" instead. Let me know if there's something I can do to help get this isolated. Using script version 2.3 from this site, with Microsoft.Exchange.WebServices 2.2, running on Windows 10 Ent x64 1803. The target mailbox is a shared mailbox, accessed using credentials of a user delegated full access to the shared mailbox. In general the report is perfect, but I sometimes get this small bug on a small number of objects. Thanks for any help you can offer!
Here is an example XML record that is giving the script trouble (IPs and domain names changed for privacy). Thanks for any insight on this!

<record>
  <row>
    <source_ip>1.2.3.4</source_ip>
    <count>1</count>
    <policy_evaluated>
      <disposition>none</disposition>
      <dkim>pass</dkim>
      <spf>fail</spf>
    </policy_evaluated>
  </row>
  <identifiers>
    <header_from>myactualemaildomain.com</header_from>
  </identifiers>
  <auth_results>
    <dkim>
      <domain>strangeofficetenant.onmicrosoft.com</domain>
      <result>pass</result>
      <selector>selector1-strangeofficetenant-COM</selector>
    </dkim>
    <dkim>
      <domain>myactualemaildomain.com</domain>
      <result>pass</result>
      <selector>selector1</selector>
    </dkim>
    <spf>
      <domain>strangeofficetenant.com</domain>
      <result>pass</result>
    </spf>
  </auth_results>
</record>
Looks like two unique DKIM results are present in this particular XML report, which is causing the issue. Using a few of the key lines from the 2.3 script to recreate the issue in the console:

$bob = [xml](Get-Content -Path ".\google.com!myactualemaildomain.com!1543881600!1543967999.xml")
$temptable = New-Object psobject
$temptable | Add-Member -Type NoteProperty -Name "dkimresult" -Value $bob.feedback.record.auth_results.dkim.result
$temptable | Add-Member -Type NoteProperty -Name "spfresult" -Value $bob.feedback.record.auth_results.spf.result

Then looking at the results in a few different ways exposes the issue. Write-Output $temptable results in this output:

dkimresult   spfresult
----------   ---------
{pass, pass} pass

Write-Host $temptable results in this output:

@{dkimresult=System.Object[]; spfresult=pass}

So two entries end up in an array where the export expects a single string per record. Not sure of a good fix for this, or whether google.com's report is an invalid format, but at least the issue can be reproduced with this specific XML report. Hope this helps get to the bottom of the issue!
Thx for your response. I will look into it. In basic terms it means the script encountered an array where it would normally encounter a string.
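One possible workaround sketch for that array-where-a-string-was-expected case (this is not the official 2.2a fix, just an illustration with made-up sample values): coerce the value to a single string before storing it, for example by joining the unique entries.

```powershell
# Simulate what the XML property lookup returns when a report
# contains two <dkim> blocks: an array instead of a single string.
$dkimraw = @('pass', 'pass')

# Collapse to one string so the report shows 'pass' rather than
# the type name System.Object[]. Mixed values (e.g. pass + fail)
# would come out as 'pass,fail' and stay visible in the report.
$dkimresult = (@($dkimraw) | Select-Object -Unique) -join ','

$dkimresult   # -> pass
```

Wrapping the input in @() means the same line also works when the lookup returns a plain string for single-DKIM reports.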
I can confirm with peacepenguin that the two DKIM entries under auth_results are what is tripping up the script. You can see it in snips from three XML files (client's domain sanitized):

<auth_results>
  <dkim>
    <domain>siuecougars.onmicrosoft.com</domain>
    <result>pass</result>
    <selector>selector1-siue-edu</selector>
  </dkim>
  <dkim>
    <domain>EXAMPLE.COM</domain>
    <result>pass</result>
    <selector>selector1</selector>
  </dkim>
  <spf>
    <domain>siue.edu</domain>
    <result>pass</result>
  </spf>
</auth_results>
-------------
<auth_results>
  <dkim>
    <domain>xynage.io</domain>
    <result>pass</result>
    <selector>default</selector>
  </dkim>
  <dkim>
    <domain>EXAMPLE.COM</domain>
    <result>pass</result>
    <selector>selector1</selector>
  </dkim>
  <spf>
    <domain>dev.credencys.net</domain>
    <result>softfail</result>
  </spf>
</auth_results>
------------
<auth_results>
  <dkim>
    <domain>flchealth.onmicrosoft.com</domain>
    <result>pass</result>
    <selector>selector1-flchealth-org</selector>
  </dkim>
  <dkim>
    <domain>EXAMPLE.COM</domain>
    <result>fail</result>
    <selector>selector1</selector>
  </dkim>
  <spf>
    <domain>flchealth.org</domain>
    <result>pass</result>
  </spf>
</auth_results>
Thanks JRBlood for confirming. I don't have any insight into the double DKIM wrapper, but it is interesting that both these issues involve ".onmicrosoft.com" domains. Perhaps there is something odd about Office 365's DKIM implementation? I only see this issue on a few reports, so perhaps it's a type of spoofing that targets Office 365 domains?
These are double signed, probably due to a forward or by an MTA in the middle. How big is this issue for you at the moment?
What would the sleep code be to add in for the geolocation? Once the script runs and it hits the max, it just sleeps forever; I had it run for 2 days and it was still sleeping after every 70 minutes. Not sure where to put the sleep interval. Second, I also had some output about Test-Path but didn't see anywhere that this needed to be configured. Should I ignore those errors? After the script ran without geolocation, data was not uploaded to the Power BI API.
Hi Takers14, This is a known bug at the moment. It is already fixed in the 2.2 alpha; if you send me an email I can send you the 2.2a code, as it is still in testing before release.
For the second issue I would need more data; if you could ZIP the work directory and send it over to me, I can investigate. If the error is System.Object[], then this is a known issue too and is also fixed in the 2.2a code.
Martijn, thank you very much. I have emailed you at the contact@tech-savvy.nl address. I appreciate your help in this area!
Hi Martijn, just wanted to check if you received my email regarding the alpha 2.2 version for testing? Thanks
Martijn, thanks for the update. The sleep interval is fixed now. However, I am still not getting any data in the Power BI API. The push URL is entered correctly as well. Oddly, the last refresh time shows an old date and is not updating. Below is the last error before the script completes:

Mode   LastWriteTime       Length Name
----   -------------       ------ ----
d----- 6/26/2018 4:58 PM          report

Test-Path : Cannot bind argument to parameter 'Path' because it is null.
At C:\Create-DMARCreport\V2.2\V2.2\Create-DMARCreport.ps1:2010 char:25
+ if (test-path -path $sourcefile.fullname)
+ ~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [Test-Path], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.TestPathCommand

Compare-Object : Cannot bind argument to parameter 'ReferenceObject' because it is null.
At C:\Create-DMARCreport\V2.2\V2.2\Create-DMARCreport.ps1:2018 char:51
+ ... $newrecords = Compare-Object -ReferenceObject $sourceobject -Differ ...
+ ~~~~~~~~~~~~~
+ CategoryInfo : InvalidData: (:) [Compare-Object], ParameterBindingValidationException
+ FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.CompareObjectCommand
I am working on a 3.0 where the Power BI part is separated from the HTML reports; however, don't expect a release very soon, as I am building a new tool at the moment. If you want, I could take a look at your configuration, but that would mean a remote session.
Martijn, that is something I am all for! Let me know how you would like to proceed and we can schedule a time for the remote session.
You should update to version 2.3 if you have issues with Power BI. Due to a deprecated GEO API, the data offered to Power BI was not complete and caused issues; the API I used was deprecated somewhere in the last months. If you still have issues, please send me an email and I will see what I can do to help you with a remote session: contact@tech-savvy.nl
There is a tool available online called MailReport. See www.mailreport.eu
Hello, Great tool. Just a few ideas for improvement and a question: - What about adding a script parameter to directly accept a credential stored as an encrypted XML object (a -Credential parameter, as found on some native PowerShell commands and third-party scripts)? You would have one parameter instead of two, or be able to select the way to authenticate. - Currently the tool needs to connect to a user mailbox, so for Office 365 people we need a license for this account. I have tested with a shared mailbox connection but was not able to make it work. It would be great if it worked with a shared mailbox, based on delegation to the authenticating user or using impersonation. Finally, I would like to use your Power BI part, but you don't provide any PBIX file. Can you add more information on how to deal with Power BI in a future release? Regards Vincent VALENTIN
Hi Vivalentin, Thx for the feature requests. On the credential parameter: as this will run as a scheduled task something like 99% of the time, the native encrypted version is preferred, and as EWS still accepts nothing better than basic auth over HTTPS, it does not really matter for security. I am curious why you run into issues with the shared mailbox, as my demo setup is a shared mailbox accessed via an account that has full permissions on it. Please enable the -trace flag and email me your results. Finally, for Power BI you can just create the streaming API dataset as described in the manual; then you can create any report you want by dragging columns from the dataset into new visuals.
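For reference, the native encrypted approach usually boils down to Export-Clixml / Import-Clixml (the account name and file path below are only examples). On Windows, the secure string is protected with DPAPI, so the exported file only decrypts for the same user on the same machine, which fits the scheduled-task scenario:

```powershell
# Build a credential. In practice you would run Get-Credential once
# interactively; a plain-text password is used here only for the demo.
$password   = ConvertTo-SecureString 'ExampleOnly!' -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential(
    'dmarc-reader@contoso.com', $password)

# Store it for the scheduled task...
$credPath = Join-Path ([System.IO.Path]::GetTempPath()) 'dmarc-cred.xml'
$credential | Export-Clixml -Path $credPath

# ...and load it back inside the script run.
$loaded = Import-Clixml -Path $credPath
$loaded.UserName   # -> dmarc-reader@contoso.com
```

Note the caveat: because of DPAPI, the export must be created under the same account the scheduled task runs as (and on non-Windows PowerShell the file is not encrypted at all).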
Hi! When performing autodiscover, EWS is not as intelligent as Outlook or ActiveSync. It will first hit https://companyname.com/autodiscover/autodiscover.xml, which is normally your public website. If there's a custom 404 page or a redirect there, EWS, unlike the Microsoft Remote Connectivity Analyzer or Outlook, will not realize it's irrelevant; it will try to parse it, fail, and then throw something like "The expected XML node type was XmlDeclaration, but the actual type is Element." I worked around this by hard-coding:

$EWSconnection.Url = 'https://outlook.office365.com/EWS/Exchange.asmx'

For pure cloud scenarios, where there are no SCPs, you may consider adding an -isO365 switch of some sort to bypass autodiscover, as the EWS URL in O365 is always the same. Overall, great script, thank you! ~Egor
Hoi Egor, Thx for the great feedback; if you have any more, let me know. I will consider this feature request for a new release.
This functionality is implemented in version 2.1, which will be released soon.
Dear Martijn. Thank you for the amazing script; it really helps a lot in analyzing the DMARC data. However, I have experienced some weird behavior in the results. The report has some permanent, none, and neutral SPF incidents recorded by the MTA, which is not expected. To elaborate: some of the incidents show our internal email gateway IPs in the report with soft fails, for example, which is not expected. Please let us know if any modification needs to be made to the code inputs to exclude the email filter IPs and not mark them in the analysis, or whether they should be there. Regards. Hisham Mezher
DMARC does not filter out your own MTA. If you get reports back that your own MTA IP failed a check, that is an issue you should investigate; it can have multiple causes and is hard for me to analyze without having the data. If you want to reduce the amount of data, you can also use the switch -DMARCfailedonly so items that pass DMARC will not be reported, or use -PowerBIuploadenabled and configure Power BI to get an even richer report where you can filter out data.
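For example, typical invocations with those switches might look like the following (the script name and switch spellings are taken from this thread; check the script's own help for the authoritative parameter list):

```powershell
# Only report items that failed DMARC:
.\Create-DMARCreport.ps1 -DMARCfailedonly

# Or push everything to Power BI for richer filtering there:
.\Create-DMARCreport.ps1 -PowerBIuploadenabled
```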